Lecture-2 Reinforcement Learning¶

From Monte Carlo Methods to Proximial Policy Optimization¶

Minchiuan, May 2023

image.png

Outlines:¶

  1. Why we need reinforcement learning
+ 1.1. real cases need
+ 1.2. RL's differences with supervised learning

  1. Markov Decision Process
  1. First Step Monte Carlo Methods
  1. Temperal-Difference Learning
  1. Deep Reinforcement Learning
  1. Deep Q Learning
  1. Policy Gradient
  1. Actor-Critic
  1. Trust Region Policy Optimization (TRPO)
  1. Proximal Policy Optmization

References:

  1. Reinforcemenet An Introduction: link, in our gruop file "课前预习资料"
  2. CS234, Stanford, https://web.stanford.edu/class/cs234/
  3. CS285, Berkeley,https://rail.eecs.berkeley.edu/deeprlcourse/

In the previous lesson, we've learned Generative Pretrained Transformer. By this method, we can project different words to different vectors explicitly.

Actually, "GPT" model maximum the $argmax_{\theta}Pr(w_n | w_1, w_2, \dots, w_{n-1} ; \theta)$

e.g-1

Give two words, $w_1, w_2$ (今天晚上), which probability of the fellowing will be bigger:

  1. $str(w_1, w_2, w_x, w_y)$ -> "今天 晚上 吃 什么"
$$ Pr(w_y | w_1, w_2, w_x) Pr(w_x | w_1, w_2) Pr(w_2 | w1)$$
  1. $str(w_1, w_2, w_a, w_b, w_c, w_d, w_e, ..., w_z)$ -> “今天 晚上 咱 去 玩 点 不一样的 要不 就去 水立方 游泳吧”
$$ Pr(w_z | w_y, w_x, ... w_1) Pr(w_x | w_1, w_2) \cdots Pr(w_2 | w1)$$

the probability of sentence 1 will bigger than sentence2. Therefore, given the $w_1, w_2$ as "今天 晚上", this model will generate "$w_x : 吃$"

But for a generation task, the sentence 2 may be better.

Therefore, we need model can produce some results that may not be the model initially generated.

However, in order to generate some more readable sentences, how to let some model know if its result is good or not?

Mathematically, in the previous training task, we've defined a model that generates next token by given so many labels.

$$ \text{dataset} = \begin{bmatrix} x: (w_i, w_{i+1}, .., w_{i+k}), y: w_{i+k+1} \\ x: (w_j, w_{j+1}, .., w_{j+k}), y: w_{j+k+1} \\ x: (w_m, w_{m+1}, .., w_{m+k}), y: w_{m+k+1} \\ x: (w_n, w_{n+1}, .., w_{n+k}), y: w_{n+k+1} \\ \end{bmatrix}$$

By define the loss, such as cross entropy, we can get some model to generate tokens.

however, by the analysis of e.g-1, the goal we need is not just for we get the best assumption in each step, but we need a accumulated better result

which means, maybe in each step, we do not choose the best next action, but as a long term vision, we get some better result.

If we define our question, it like that: $$ \mathbf{Tokens_{gen}} = \begin{pmatrix} step_i, f(x_{i, .. i + n}; \theta) \rightarrow token_{i+n+1} \mid \theta \\ step_{i+1}, f(x_{i+1, .. i + n+1}; \theta) \rightarrow token_{i+n+2} \mid \theta \\ step_{i+x}, f(x_{i+x, .. i + n+x}; \theta) \rightarrow token_{i+n+x+1} \mid \theta \\ step_{i+n}, f(x_{i+k, .. i + n+k}; \theta) \rightarrow token_{i+n+k+1} \mid \theta \\ \end{pmatrix}$$

we need the accumulated result of $$token_{i+n+1}, token_{i+n+2}, token_{i+n+x}, token_{i+n+k}$$ to be the best.

reward, goal, value¶

therefore, in order to evaluated the each step's performance and the final result, we add a mark reward on each step:

$$ \mathbf{Tokens_{gen}} = \begin{pmatrix} step_i, f(x_{i, .. i + n}; \theta) \rightarrow token_{i+n+1} \mid \theta, Reward_i \\ step_{i+1}, f(x_{i+1, .. i + n+1}; \theta) \rightarrow token_{i+n+2} \mid \theta, Reward_{i+1} \\ step_{i+x}, f(x_{i+x, .. i + n+x}; \theta) \rightarrow token_{i+n+x+1} \mid \theta, Reward_{i+x} \\ step_{i+n}, f(x_{i+k, .. i + n+k}; \theta) \rightarrow token_{i+n+k+1} \mid \theta, Reward_{i+n} \\ \end{pmatrix}$$

Mathematically, we are updating $\theta$ to get the biggest accumulated reward.

$$ argmax(\sum_{i} Reward_i \mid \theta) $$

long-term object, sequential series learning

decision making¶

self-playing learning

We call this as our object of learning.

some more situations

1. Robotics¶

2. Self-Driving Car¶

3. Trending¶

4. Game Playing¶

Some common terminologies in Reinforcement Learning.¶

agent: the entity to make some decisions, in AI algorithm, it usually means our training model;

(multi-agents: some tasks that need more than one agent to achieve the task, MARL)

state: the representation of the current agent preceived.

each step, i, $S_i$, next state: $S_{i+1}, S'$, $\mathcal{S}$, which is the collectoin of all possible states. $S_i \in \mathcal{S}$

terminal state: the final state, $ S_T $

environment: something outside of this model or agent, environment will give the reward and next-state, $\mathbf{Model}$

action: based on the current "state", the decision that agent has made. $a \in \mathcal{A}$

reward: a numerical representation of by the current action and state, $Reward(s, a) $, $r, R, reward$

transtion-probability: $$ Pr(S_{t+1} | S_t, a_t : \theta) $$

value: in a state, The expectation of the future reward's sum, $$V(s_i)$$

Practice: please describe the state, action, reward in above examples.¶

In this simplest situtation, $$ V(s_i) = R({s_i}) + R(s_{i+1}) + R(s_{i+2}), ... , + R(s_{T}), \\f(s_i) = a_i, environment(s_i, a_i) \rightarrow s_{i+1} $$

Therefore, our learning object is to maximize the $ \mathbf{Value}(S_0)$.

In order to make our model learn mode efficiently, we let them focus on the current reward more than future's reward a little.

Value with discount, $$ V(s_i) = R(s_i) + \gamma R(s_{i+1}) + \gamma ^2 R(s_{i+2}), ... , + \gamma ^ {T - i} R(s_{T}), \gamma \in (0, 1)$$

Dynamic programming character in RL , Bellman Equation

$$ V(s_i) = R(s_i) + \gamma V(s_{i+1}) $$

But for the real situatoin, $$ Pr(s_{t+1} | s_t, a_t) \neq 1$$

$$ V(s_i) = \sum R(s_i, a_i) * Pr(a_i | s_i: \theta) + \sum_{s_t \in |S|} \sum_{a \in |A|}(Vs_t * Pr(S_{t+1}|S_t, a)) * Pr(a | s_t)$$

assuming $\exists v(s: \theta) \rightarrow \mathcal{R}$, in order to get the function parameters' $\theta$

most straightforward: $v(s)$, we get all the future $reward_i$

a smart way: $s_t, a_t, s_{t+1}, r_t$

$$ v_{new}{s_t} = r_t + \gamma v(s_{t+1} : \theta) $$$$ \mathcal{L(\theta)} = (v_{new}s_t - v(s_t))^2$$

Policy Function:¶

$$ \mathbf{\pi(a | s)} = policy(a | s_i : \theta) = Pr(a|s_i : \theta)$$$$ \mathbf{\pi(s)} = a, a \in \mathcal{A}$$

Therefore, our target in refinforcement learning is to find some policies can maximize the $Value(S_0), S_0 \in \mathcal{S}$

Q-Value: quality-value, the quality of some action¶

Q-value: the specific value in some state ($S_i$)and on action($a_i$)

$$ Q(s_i, a_i) = R(s_i, a_i) + \sum_{s_t \in |S|} \sum_{a \in |A|}(Vs_t * Pr(S_{t+1}|S_t, a)) * \pi(a | s_t)$$$$ Value(s_i) = \sum_{a \in |A|} Q(s_i, a_i) * \pi(a | s_i)$$

Therefore, we can by finding the action which can make the biggest Q-value to learn how to make decision in each step.

Q(s, a) and V(s) make the learning process more efficient¶

$Q(s, a; \theta)$, some Q values may be very high, agent/model believe this action is very good action, then, the paramters will update to this direction with a big step

$V(s)$, if V(s) is a very big number too.

advanrage value

$$ A(s, a) = Q(s, a) - V(s) $$

Markov Properties¶

Image the fellowing two situtaions:

For a dirving car problem, the state is the vision surrounding some one has seen and the car's monitor information, such as engine round, gas amount, etc.

Goal is to reach the desitination as quick as possible.¶

$$ reward(state_t) = -1 $$

We are training a model, which are maximize the $\sum{R_i}$

the next state only depends on the current state and environment random/stochastic variables

But, if this car suddenly remembers that there is a place it should have vistited but did not visit this place.

Then, this car will change its route and visit that place firstly.

In this sitation, the next state does not only depend on the current state, but only depends on memory.

Markov Chain:¶

If some a sequence of events or states where the probability of transitioning to a future state depends only on the current state and Env, and is independent of the past history.

In a Markov chain, the system evolves through a series of discrete time steps, moving from one state to another. The key components of a Markov chain are:

States: A set of possible states that the system can occupy. Each state represents a distinct condition or configuration.

Transition probabilities: For each pair of states, there is a probability associated with transitioning from one state to another in the next time step. These transition probabilities are often represented using a transition matrix or transition probability function.

Initial state probability distribution: The probability distribution that determines the initial state of the system. It represents the likelihood of starting from each possible state.

Markov chains are often referred to as "memoryless" systems

"Markov Decision Process" (MDP)¶

States: A set of possible states in the system. Each state represents a specific condition or situation in which the agent can find itself.

Actions: A set of possible actions that the agent can take in each state. The agent selects an action based on its current state.

Transition Probabilities: For each state-action pair, there is a probability distribution that defines the likelihood of transitioning from the current state to a subsequent state based on the chosen action.

Rewards: A numerical reward associated with transitioning from one state to another due to the chosen action. The reward captures the desirability or value of reaching a particular state.

Discount Factor: A discount factor between 0 and 1, denoted as γ, which determines the importance of future rewards. It influences the agent's preference for immediate rewards versus long-term rewards.

Initial state probability distribution: The probability distribution that determines the initial state of the system. It represents the likelihood of starting from each possible state.

image.png

image.png

Early Trials to solve Reinforcement Learning¶

Monte Carlo Methods¶

  • task
  • intitial, $s_0$
  • intital policy $\pi(s)$, random initialized
q_table = {
    state: [a1_v, a2_v, a3_, .. .. ]
}

TRAIN, EVALUATE = 'train', 'evaluate'
mode = 'train' 

def policy(s): 
    if mode == TRAIN:
        epsilon = 0.05
        if random.random() < epsilon: 
            return random.choice(range(len(q_table[s])))
        else:
            return np.argmax(q_table[s]) 
    elif mode == EVALUATE: 
        return np.argmax(q_table[s]) 

state = env.reset() # initial state
done = False

value_table = dict(list)


trajectories = []

for _ in range(10000):
    while not done: 
        action = policy(state) 
        next_state, reward, done = env.step(action)

        state = next_state

        trajectories.append((s, a, reward, s_next))

goal = 0
gamma = 0.99

q_table_s_a_count = defaultdict(int) 

for _s, _a, _r, _s_n in trjectories[::-1]: 
    goal  = _r + gamma * goal
    q_table[_s][_a] += goal
    q_table_s_a_count[(_s, _a)] += 1

for _s, actions in q_table.items():
    for a in actions: 
        q_table[_s][a] = q_table[_s][a] / q_table_s_a_count[(_s, _a)]

image.png

image.png

In [1]:
import gymnasium as gym
env = gym.make('Blackjack-v1', natural=False, sab=False, render_mode='rgb_array')
/Users/gaominquan/anaconda3/lib/python3.7/site-packages/pandas/compat/_optional.py:138: UserWarning: Pandas requires version '2.7.0' or newer of 'numexpr' (version '2.6.9' currently installed).
  warnings.warn(msg, UserWarning)
In [3]:
import matplotlib.pyplot as plt
Bad key text.latex.preview in file /Users/gaominquan/anaconda3/lib/python3.7/site-packages/matplotlib/mpl-data/stylelib/_classic_test.mplstyle, line 123 ('text.latex.preview : False')
You probably need to get an updated matplotlibrc file from
https://github.com/matplotlib/matplotlib/blob/v3.5.3/matplotlibrc.template
or from the matplotlib source distribution

Bad key mathtext.fallback_to_cm in file /Users/gaominquan/anaconda3/lib/python3.7/site-packages/matplotlib/mpl-data/stylelib/_classic_test.mplstyle, line 155 ('mathtext.fallback_to_cm : True  # When True, use symbols from the Computer Modern')
You probably need to get an updated matplotlibrc file from
https://github.com/matplotlib/matplotlib/blob/v3.5.3/matplotlibrc.template
or from the matplotlib source distribution

Bad key savefig.jpeg_quality in file /Users/gaominquan/anaconda3/lib/python3.7/site-packages/matplotlib/mpl-data/stylelib/_classic_test.mplstyle, line 418 ('savefig.jpeg_quality: 95       # when a jpeg is saved, the default quality parameter.')
You probably need to get an updated matplotlibrc file from
https://github.com/matplotlib/matplotlib/blob/v3.5.3/matplotlibrc.template
or from the matplotlib source distribution

Bad key savefig.frameon in file /Users/gaominquan/anaconda3/lib/python3.7/site-packages/matplotlib/mpl-data/stylelib/_classic_test.mplstyle, line 421 ('savefig.frameon : True')
You probably need to get an updated matplotlibrc file from
https://github.com/matplotlib/matplotlib/blob/v3.5.3/matplotlibrc.template
or from the matplotlib source distribution

Bad key verbose.level in file /Users/gaominquan/anaconda3/lib/python3.7/site-packages/matplotlib/mpl-data/stylelib/_classic_test.mplstyle, line 472 ('verbose.level  : silent      # one of silent, helpful, debug, debug-annoying')
You probably need to get an updated matplotlibrc file from
https://github.com/matplotlib/matplotlib/blob/v3.5.3/matplotlibrc.template
or from the matplotlib source distribution

Bad key verbose.fileo in file /Users/gaominquan/anaconda3/lib/python3.7/site-packages/matplotlib/mpl-data/stylelib/_classic_test.mplstyle, line 473 ('verbose.fileo  : sys.stdout  # a log filename, sys.stdout or sys.stderr')
You probably need to get an updated matplotlibrc file from
https://github.com/matplotlib/matplotlib/blob/v3.5.3/matplotlibrc.template
or from the matplotlib source distribution

Bad key keymap.all_axes in file /Users/gaominquan/anaconda3/lib/python3.7/site-packages/matplotlib/mpl-data/stylelib/_classic_test.mplstyle, line 490 ('keymap.all_axes : a                 # enable all axes')
You probably need to get an updated matplotlibrc file from
https://github.com/matplotlib/matplotlib/blob/v3.5.3/matplotlibrc.template
or from the matplotlib source distribution

Bad key animation.avconv_path in file /Users/gaominquan/anaconda3/lib/python3.7/site-packages/matplotlib/mpl-data/stylelib/_classic_test.mplstyle, line 501 ('animation.avconv_path: avconv     # Path to avconv binary. Without full path')
You probably need to get an updated matplotlibrc file from
https://github.com/matplotlib/matplotlib/blob/v3.5.3/matplotlibrc.template
or from the matplotlib source distribution

Bad key animation.avconv_args in file /Users/gaominquan/anaconda3/lib/python3.7/site-packages/matplotlib/mpl-data/stylelib/_classic_test.mplstyle, line 503 ('animation.avconv_args:            # Additional arguments to pass to avconv')
You probably need to get an updated matplotlibrc file from
https://github.com/matplotlib/matplotlib/blob/v3.5.3/matplotlibrc.template
or from the matplotlib source distribution
In [2]:
env.reset()
Out[2]:
((16, 4, False), {})
In [4]:
plt.imshow(env.render())
Out[4]:
<matplotlib.image.AxesImage at 0x7f7982d63c50>
<rdf:RDF xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:cc="http://creativecommons.org/ns#" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"> <cc:Work> <dc:type rdf:resource="http://purl.org/dc/dcmitype/StillImage"/> <dc:date>2023-05-13T16:09:09.086915</dc:date> <dc:format>image/svg+xml</dc:format> <dc:creator> <cc:Agent> <dc:title>Matplotlib v3.5.3, https://matplotlib.org/</dc:title> </cc:Agent> </dc:creator> </cc:Work> </rdf:RDF> <style type="text/css">*{stroke-linejoin: round; stroke-linecap: butt}</style> <image xlink:href="data:image/png;base64, iVBORw0KGgoAAAANSUhEUgAAAbwAAAFyCAYAAACOUm8bAACENElEQVR4nOzdd5xV9YH//9fnnHN7n14ow9BRBEXUGDXGHktiYvoaU0wzMU3d7GY3m7j5bftm08uuKZq4Gk2sxK5YUVEUBFR6m2GYdu/M7f2Uz++PO4wgoBhRkPt5+uDhzLmnfO5n4L7nc86nCM+l0ySKoiiKcpjTDnYBFEVRFOXtoAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6oAJPURRFqQsq8BRFUZS6YBzsAijK260l3EQsEHlLzh3PjpAqZPZr35A3QGukGV3T3/R1K2aF/tQQpm296XMpyuFKBZ5Sd/7h3C/z1TM+/dac+5b/5FcP/xHbcV533zOOOJlffeoHNAZjb/q6q7av5e+u+Tpb4r1v+lyKcrhStzQVRVGUuqACT1EURakLKvAURVGUuqACT1EURakLKvAURVGUuqACT1EURakLwnPpNHmwC6Eob6czjziZY7vn7fW1kCfAR48/nwmxtr2+3jc6wB+fvg1H7v2fzSNrnuL5bauRY69f9b4v4Xd797qvRGI7DgfiH6AAdE1H7OP11X3r+OsLDx2AKynKO5cah6fUncVrnuThtU/t9bW2SDMnzVi4z8DbkRrih/f+L5Zj7/V1uUsQCiH45lmfoyEQ3eu+d77wIF+/8fskC+k3VP69mTdxNjd+6ed0N0/a6+s3PnOnCjyl7qnAU+qS3EcLrbZ9320uCThS7vP4VxNCoGl7f3IgEMg3cK7XIpH7vJaUtdcUpd6pZ3iKoihKXVCBpyiKotQFFXiKoihKXVCBpyiKotQF1WlFUQ4Sr8tDc7gRQ3/z/wzD3hCpQpbhzMheXpVkirk3fQ1FeadTgacoB8mCrrn84u+upvoaa9jt2rdSjn2/6/93SmRH+dmDvyddzO52/M59BtPDB6TMivJOpgJPUQ6S1kgTrZGmA3KuVdvX8u93/1Kth6cor0E9w1MURVHqggo8RVEUpS6owFMURVHqggo8RVEUpS6owFMURVHqguqlqSgHSbqYpS85iL2PlRfeiI1DW6mY1QNQKkU5fKnAU5SD5MmNz/Htv/wHyULmTZ/LdmyKldIBKJWiHL5U4CnKQWJaFplijsyrBosrivLWUM/wFEVRlLqgAk9RFEWpCyrwFEVRlLqgAk9RFEWpCyrwFEVRlLqgemkqdee02SdyTNfcvb4W8gZoj7YekOtIKfnVw9fjd/v2+rppW1x6yseRuy30U2M5Nk9vfJ7nt60+IGVRFEUFnlKHzpt3Gl85/ZK35Vr/ec+v9/naBxecw68+9QMaAtE9XqtYVX6w6Ocq8BTlAFKBp9QlIcTr7/Q2OZTKoiiHM/UMT1EURakLKvAURVGUuqACT1EURakLKvAURVGUuqA6rSh1p2xVyZbye31NEwKf24uu6W95OSzbIl8u4NJde7xWtaoIIQh5g3/buR2LUrX8ZouoKIcV4bl02p6DgBTlMDattYsJsfa9vtYQjPK9D3yDWe1T9/r60s0vcOYPP3lA1rBrDjUws20qhr7n752GrnPSjIUc3z3/bzr3c1tX86P7f0OuXHiTpVSUw4dq4Sl1Z/NwD5uHe/b6WlukmWwp97aUI5FLksgl9/qax3Dz3tkn8t7ZJ/5N5y5VK3sNUkWpZ+oZnqIoilIXVOApiqIodUEFnqIoilIXVOApiqIodUEFnqIoilIXVDcuRdmFaZusHdgM7H1C5/UDm5HyrR/J40jJjuQAz23921ZL2DS8Dct+80MnFOVwosbhKcouNKHRGIzhNvYcDA5QtUwSudG3pSwRX4igN/A3HVs2KyQL6bclnBXlnUK18BRlF4503rZAez2ZUo7M2zQmUFHqgXqGpyiKotQFFXiKoihKXVCBpyiKotQFFXiKoihKXVCBpyiKotQFFXiKoihKXVCBpyiKotQFFXiKoihKXVCBpyiKotQFFXiKoihKXVCBpyiKotQFFXiKoihKXVCBpyiKotQFFXiKoihKXVCBpyiKotQFFXiKoihKXVCBpyiKotQFFXiKoihKXVCBpyiKotQF42AX4K3UGm6iNdL8mvskC2n6U0NIKd+mUimKoigHw2EdeF889ZP80wWXv+Y+1z15C1f9+d8oVctvU6kURVGUg+GwDjwAIcRrv85rv64oiqIcHg5Y4EV8ISK+EACZUo5MKfeGzxH2BYn6wgBky3nSxez4a5oQtEdb0cXrP3YsVEuM5lNkSjl6R/tfc9+KVWVCrJ2KWaFQLZHMp9j15mbQGyDmD79mMBbNMqP5FH63j8ZAFIB8pUiqkEGibpUqiqIcCoTn0mkH5BP5X97/df75/V8D4L/u/R/+/a5fYdrmGzrHd87/Kt+/8JsA/PiB3/Gvd/6U6tg5GgJRVv3bA7SEGl/3PLc9fx+f/f2VmLb1uvt+9uSP8uOPfxe/x8etz93LF//wj5TMV25vfuX0S/iPD38br8uzz3Pcs+oRLr3277lwwdn85jP/CcANS+/g6zd+X90qVRRFOUSoXpqKoihKXTikn+FpmkZ3wySgdrtzx+gguVJ+r/uGvUGaQg37fGYX9YdpGLvdOE5Ac7hxj2MivhCNwRgALt1gW6IPt+HabZ+gJ0BTqAFdU78zKIqivBMc0oEX8Ph54p9uQdd00sUsF/zscyTz6b3ue/GJH+LfLroKj+He6+uXnPRhrjz7C7iM3d+yx3Dvcbvy4ye8f/zW6p+eWcT7fnwJFau62z4fOvZ9/OeHv0147LmloiiKcmg7pANPCEHMH8XQdaSUZIs5UoXMXvctVIq8Vv8Qn8tDLBDZo6W2N16XZ7w1qAmNVCGzR+A9uWEZ/3TbD3EbbnoSfbs991MURVEOPYd04DmOw+bhbWiaTq6cZ1LjBKL+8F73bQ037XFrsiEQpTEYBWrhuSXeg67t/pYjvhDN4Qa0ffT+jPrDTGvtovqqwMuWCvzxqdtwHAeJVAPXFUVRDnGHdOAVqyUu/MUXAAj7Qvz2s/9JxLf3wAt5g7j03d/OJ991IV89/RIA/rzsLj7y669gvarn5keOO59/Ov+r+NzevZ73gvmn8+7px+4RaHevepgf/PXn5MuFv+m9KYqiKG+vQzrwHMehZ2QHUGutdcTa9mtYwk4xf5gpzROBWguvd6R/j6ESI7nka7bOwr7QXp/TnT7n3TjSoWJW2TC4hduX37/f5VIURVHefm9J4DUEokxr7cJ6jXF4uXKB4czIfg/MtqXD1vh20oXsXl+P+EO0hPbscflmpQsZEvnUHqHYEIjy1dMuQdM07l39KPesfnS318PeIFNbJrNpaNtuz/8CHj8toUYSuVHyleIBLauiKIqyb29J4F14zNm8a9qC12w5LV7zJD9Y9LM9OoPsS75c4LLr/wlD23uRP7jgbK5635f2q1PKG/Hgy0v4xeI/ULV2D+9zjjqVb5/7ZULeAJrQ9piH5ZSZx3Pdpf/Np393BesGNo9vXzjlKK5635f4yYO/49G1Sw9oWRVFUZR9e0sCryXcSEv4tW89bhjcss+OIju5XW6+esan0faj1WY5Nr98+A8AvNS3Adtx9r/Ar7Ji24v8+IHfAlA2K5w66128ugvo3Akz93hmuKtYIELEH+Lz7/kEA6mh8e1zOmdwTNeR49OwKYqiKG+PtyTwkoUMqUL6NVt4hUqRKc0TqVhVitUSQ+n4Hvt4XR7+9YNXYGj6617zD0/dwpU3/1utxSh5U3NYPrVpOU9vXgHAZad9in+76Cq8xqumFhOvPfF0tpRnJJfkcyd/dLexgYVqiYHUsLqdqSiK8jZ7SwLv3lWP8MenbsWy7X3uM2/SbH5+8dW4dBfPbV3FP9/233vdTxMa2n7NZiJwpHPAhgfsep79L8Mrlm5ezs8evI6fX3w1M9u6x7ev6l3DD++7hhU9Lx2QciqKoij75y0JvP70EM9tXf2ak0dPbOxgQddc/G4f2VJur7ctq5bJtUv+/Lq3PqEWMK++jbmi92V+9/jNACzf9hKO3PM254bBLfzxqVtx6S5e6H0Zy9l92MKaHRv4w5O37PPZIcBLO9ZTsapsHNw6fr1lW1fxzOYV/GnpnUxs6Bjf98W+dTyzeYVq4SmKorzN3pLAi/hCTGrseM3VCvxuLzuSg3gMD/Hs6F5vQZbNMt/409V/cznuW/0o972q9+SrLd28gqVjty/35okNy3hiw7L9ut4zW17gmS0v7Lbth/dds1/HKoqiKG+ttyTwzpt3Gkd2zsR5jedoa/s38c2bfoBlWyQL6b2Ho5q8RFEURTlA3pLAm9TYyaTGztfcJ54Z4ZlNK9QclIqiKMrb4oAFXr5SZDgzst/7Z0q5PW5jFnY5R75cUA08RVEU5YA5YIH31xce4qW+9UDtTuSru6C8ettQNrHHYO67Vz7M2v5NSKB3dMceHUgURVEU5W8lPJdOUw0pRVEU5bCnlutWFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oIKPEVRFKUuqMBTFEVR6oJxsAugHJq0qsAoug52MQ45ZtBEGvJgF2OcXtLRKzoAUpMIB0Dsc//d95Fj/9v3/nsQY+997BipOQhH7HbNV5djf8r1qlLuVi4p5Nhl9/99VSPm/l9OqRsq8JS98qQ9NL4cO9jFOOTEF4zUPkwPEYFBH+GeEACm10KvamjOvm/cmF4Lo6wjEEgkjstBN/X9vp7tskGCbtWOqfpNXEUDsUu6WF4LzdTQ7Fo5TL+FUdR32+e1SCFxdGf8GpbbRrM0NGffx1f9Fq6SjpC19zVw8jDSdej8YqIcGlTgKQAIG5pWNcLYZ8Tx8+dzxQ2XHtxCHYK++6Mfs2l5DwBSl4zMT76tLQmjoBNbFx3//qPnncdF/985AAzk4jT4IngNzz6PH8zFaQk2oQsNCaTLGWLeyH5fP1PJIdAIewIA9Gb6mRBuRxevhOxwYYSQO4jf5QWgLzvIxHD7fl/DdCwqdpWgyw/AaClN0O3Ho7v3ecz2zAAdoVYMTceRks///T9QKJVq5wuapGdl9/v6yuFLBV6dE5bAlTfQpcExU45EG/v0PnrGEcw9atZBLt2h55hpRxIRYy0qLJ5MP4cUkmrIhP1vKL1hWkXDKBpERJATZx0zvn3e7NnMmD0FAF/KS2Mght/t3ed5/Ckf7ZEWdK0WeKOFFE2B/W/Jp4pZhBBEfbU60BI6XY0TMLRXAi+QDhDxhQh6fAC4Rlx0NU3Y798LqrZJ2awS9tZCNZ5LEvIG8Ln2HeRGwmBiQzsu3cBxJCfMOppSqQxAUmZYllqF1CRmWN3qrGcq8OqcUdRpWBPDb/j44aJ/xDBqn9oe974/XOrZ1775GSyrdkszmyvw4Q9dhq3bJBaMYuvOW3ZdT8pNdFOYmdOm8tM/fG98uxBgOTYAtuNgO/b493tjSwfLsZFI5Ngxr7X/Hsc7NkKI8WMcWbvm+K0BwJH2buV4ZZ/9Yzm7H2+/6vvXel9C1NLsP3747fEirV69jq9+5V+wvDbxhSMq8OqYCrw6JiyBS7qIBkIEPQFaGhoxDPVX4rU0hF+5/ec1vDSEolSFScrO4FhvQYcWWfs5eYSHaDBCLBgm5AvsdVeP4cbn9hEYa1ntfR8XfrcXQzeQUlKslgh4/PtdnLJZQRPa+DFu3YXf7cPQX2neegwPPrf3lX2M2j47w+j1GFZ1t2vkygX8bh++12i5esau4Tb27GgV9gWIhSJU3SYjVhJHd97S1rhy6FKfbnWs8aUYc1tn8rNbvkdbY/N+fyApNaFQgIcfvZG+4UG++JnvsI0+UnMyB/w6bcuaOfO9J/Hda75GJBg+4Oc/3M09ahaPPnYT24cH+PD5X2G0K0V+UuFgF0s5CFTgKQgh0DQ1JHNX0pEM3rMYI+gntnA+rlAQgPhjSzGTKcJzZxOa0V2ru7f6FwUJgtp1NG3PazmWxeDdixkujmK5gvj0fQ8nGSqnEJ7IeKeVZDWP6Q7ud1FSZhFNCIpGrRU5XEzg8zXu1mllqJKmZPjI6rXb4sOlUby+xv1/hufYVByTvFFr0cWrOQqGF6+27/c1XBzB5WsgOq2b6PwjdntNCFH7o4ld77wqdUgFnqLslSS1fDXuxhiRubNhLPCyazZQ2t6Pu6mR0Izug1zGGmk7JJetJG3nEHjwiX3/s87IAh58tcCTkjRlhNj3LdBXS1NBQ2DjHjtfliTB3QIvLQvYuKkI19j3eUIE9vsOgiltKtiYwj12fAkLF97XfF85/ARwG649Ak9RdlKBV28k+OJehCN49zHHcvTUOfg8+342orw+v8/HGWecxKoda3hqcDmllhLyTT4jErbAF6/9XM4+8xTmHTUbTd/zpKWBIUr9Q0SPPpJCOUXUFXjNFl6hnCI61sJDgjTzRN9ACw+ziBCCyFgLL1xMEH1VC69UyRA2fATGhhGkS6PEvI373VnEdGzKjklorIVnVnOEdC/e13hf6dIoMU8MzeshteJFIvOOQDN2ry+XYXDe+e/lufhq1gxvpNRSVh1Y6owKvDoU2xBBszS+8K2Pc/zx8w92cd4RpNz3vbDGSIy///svcs/dj7Dmu5soN1SQb7LHpmZqxNZFEAj+8deX4Q160Xe57byzPNmXNzCy5Fnm/OtV2KlBmkONr9lpxRndQWe0bbzTiiefpCXUuN/l8udTaEIjFqh13ikObWVC8+TdOq2QHCTqDxMaG1ZQifcwoXnyfrfwKlaVslkhMjb0wZVJEPGFXrPTSnl4G52NE8guW8WOW+4mPGcGUq/V187rej0erv7Xb/LTn1xL/03DtcBT6ooKPEV5DXaxRM91f8apmmPfFw9yiWr6/nQHhW19NJxwDDO+/VVQz2ABaFg4j8gRM9n8y+twKlUmXfwhAlMmHexiKYcIFXiKsg9GKIjudqF5PUh7rMXWWBuk7YqE3uKrSxy3AwIS+SQ+vGhCwz02bGQoO0Ixm8CuZIES5Epky3mEEKSTNpXhxF7PmiimMX2DaGO3IDOVAnnP3oc57E2uWkITgtTYLCqJ/Cj64Oj4+QBGShmyLh8+o3ZLc7iQRBsY3e9rWI5N1bEYHZsxJl3Jkza8uPU9P66EYeCf1EGuXCCRG8XQdSSS4dwITrmCJztCIFtr8WZLBaq2Sd4u4Lj3f1ygcvhQgaco+2LbCLefljNOxhXdffotw7//HT3+ZhKEFBiagaHp6JqOMdZT0UDDQMO1yzZd0zB0A2twiOE779/rKUepYOJGRyCBHFXK7HvKrlfLCwshITD20ZEQJZBe9F0ehiWpUMDAOzbYLSHKCOnd78dlFg4mDr6xa2So4sPAvZfFXYxwiO7LPj3+3g3NAOFgoOGgjdXdWJ3pOi7dQEN/YxNmK4cNFXiKsg9WsYTm8+IKh/A0vt0TaQs0U0MgiPnDeP21Z3jesem1MoYPl/AQdQVoDEYByObT+HJVsDSaYi1UEqNI28YVCSPcLqqJUXLSIogLdzCAEQoiyhmC6QruhiiauxZ8UjpUhkcwwkEMvx+7XMZM1+aidAwNTzRMCDfVkSRuFzQ2NGMIDYmkMpSgIByijVF8to6ZSpOUFkEkrkgY3euhEh8BwNPSiNBqoehYFtWRJK5YBEuH3OgowbHHplW3TizaiNt0MFMZ3E0NaGMtXT3gw5+rYOQrxNrCuF1upOMQER4cAQ3eEMGx+hFALBDBr3vRTHULuB6pwFOUw0Q5kSR5+8M0dXcz/YovsuH//ZrqaIrWs0/F3zWBjT/8n/F9Ywvm0nbeGbhfXk3uhnuY8NH3E+iuPeuyS2XWXv1jmk89kaaTjiO9ag19N90JgKe1iUkf+wBhzcOmH12Db0IH0y79LC63B8eyWPu9H2F4fEy6+CLcuQo91948fs2W008ifMQM1v/bzwGY8oWLx28NlwfjbPrpb+n80LnIkI/1v7keSrXbjoGuiXR/7CKcvmF6/3gLkz/9UbytTQBUU2k2/OevGIhoTP3mDHDtf2tVqT8q8JQDyszlKWzuQfN6CM2atkfPvOyajRihAP5JnW9ZGfJbepC2s8c4OWnb5NZvxqmahGZPR/e+9nyhnpZmXKHAQekQIoXEClogYFNiG/6SDyE0jLEWUX9lhKLMkM0PkhzcDMBQNk7WyTIytJmB+xZhTQojJwTJ9m1A9q4lKTPkdQfX7Gnky6P03P9XCtUi7nkT2VIZQWxMkd+8DSSYR00gG+9h030DVIZHyMoM4TkzqHp0csuewqO5sOZNpBQ02BDfhjmSotw/jHlEOwW7zMvPL4VckYzMkPMIQrMmkhvagja6HXPeRAA2pvowt+cp9Q0gbQdr3kRy61diViqMWhlcwiF85GzKumTl44vRHbDmTWRTbgDDSVPc3k+5f4hRmaEofGyMb8Pl9SKlwzYnjSOrVJPb8Q/WgrNklhnOJUiYo1jBQ2eJJ+XtowJPOaAq8RF23HI3npZGQjOn1mY33sXgPYsJTut6SwNvZMky7FJ5j8BzLIuBuxZjZXNM+9YXXjfwqokRcGxw3rpJofdFSDDyBkIIpjZNxhv0oAsdz1gLxuVuJE+O1kAbLa211RLkSIaQFsI1UoUnXmb6FV/C3Ril/7Z7Sa/cQJAwCcPihA99hOyKlxm8/1H0SU0c86VLEUKjsLWXbU+8jOb1MPufv8HQfY8w+sQK/ECUMDMv+BCJxCADt99H1Bti2jc/z6Z4D90tk0lteo740nXM/u436R3eTubP92GkCkQIM+zXOfFjf8fg3Q+RXb2Jmf/w1dp71DSSy1Yy8MTLeNuamfq1z9F7/a2kNm7BcFyEXH5mf+QTbF31Arn7ltAwpYtJn/okYuwXkP6n15F6fhMBwgxJjWnNk/H4/UhHYmsRbCpMiU0g2NoFQKqQIRqI0Gg0YBTUR189Uj915YCwSyXSq9ZS3N6PtG3MbJ74I08RmjkVb1sziSXPggSrUKTYN8jw4iUYAT+NJx57QK4vpWTkiWdxTJNKfATHshhevAShazSfeiKFrb3kN/dgF4tI2ya5dDm+iR2Ej5i59+ATgqb3vAvd70V/Ozqo7FmA2v8kaEJDEzqapqGPtfA0IdDHpjXbuU2M/edraSJy1BxyL64DoDqYwOX10vjuhdjFUVLPvIAuNNpOP4mcT8MwXKSWr6aaytB2+skAjD7xLOUdg2gSPK3NRObOIrtyDcVqgYaj59IQbcIwXFiZHMm1z6JJaH3PiSSfep5MZhinWMYbixE95kgqhTijjz+DOxik5aTj0cd6W44++RxOpfLKNR97BmskhcvnIzz/aCL+MMknn8MsZ2h+1wIibe0YhovRpcuxCkU84TDNJy5k9Onnx6Zeq9WPxKktcyUEuti1zjR0odWmglOdVuqSenKrvClSShzTxMzmGV36POkVLwJgF4rEFy+hsK0Pq1Ai/tAS4ouX4JQrlAeGiC9ewshTz+FUTeSbbEFJ28GpmiQee5r44iVUR1NYmRzxxUuIP/wUTrlCbv0WEo8+jVOpAjC6dDnJ51Zi5Qs45t5vbzWdfByxBfPQ/T6k4xyQsr4dPM2NNJ96IukVLxJfvIRKPIERCtJ86olEjz6S5LMrkKZF65nvITJvDk7VJLXiJYq9O2g+9UQaTlhA/OEnKe0YQrgMfB2ttfOtfJnitu1EFxxF44nH4lRNqqMpEkueRVoWTScfT+KxpSSXrURaFp6WWjlCM6aSePwZXJEwTaecgDQtpGkxuvR5rFyhts/s6cQXL8FMZ3FFQjSesIDGk45j5MllVEaTNJ503HhZk8tWknh8aW2/dy042NWtvIOoFp7y5jgO6//9F9iVCth7hsHQfY8w/MCjez20OpJk7fd/xKRLPkx49vS/uQip5asYWPQg0t5zbJW0LNb9fz/ba1AVt/Wx6cfX4IpGmPHtr+z+vFFK1v/HL3E3xpj4sfdj5Qts++2f6L7skrf0duyBkF27ibXf/9F4fXR88Fz8XRNY+/0fMew2Oemqb+EO1MbeFbf3s/ZPv6Pr859A2g5rv/8janOOSdrPO52GE48l+9J61n7/R0y/8kvk3BJdN6gmkmz66W/JdEU46TuXk35mJet+8NNar9CAn6mXXoKrUGXt93/EYFTj9KuvROgaVi7Phv/8FQAz/v4ychu21Mo6NnPMpE9dhAz72fjbG/Fbgln/9HXipTSax0Nuwxa2/99tTPv653A3NzKw6EH6Fz1wUOpYeWdSgae8adKy9xp2tRcl0t73tFzStuE1pu3ar+tLudew2+0a+3zNeWVQ+V6OGz9WHpiyvi1eVR9C0xCaNvZ+HISu176XcnxfITTQXlVXmlbr/i9EbR9dQ2i19y8Zu4YjEboOYpdjBaBriLHjcEAY+vgvFDv32+O4ndt2ltUSteM0rXaDd+f70vRauaTc9987RdkLFXiK8irpVWtIPL4Up1KlEh+h9/pbx1uIfbfcheZy03TK8cSOmXuQS/raJn/6I7iiYZLPrya96mWmfeNSPNlhNLeL1AsvMbJkGfmYh1nfuJTE489Q6h8CQHO7mPKlT5F5cS2bfvZ7/BM7mPaNSzFCISp920g8tISIy8+0b1xKef1Ktl5zA042D5pG92WX4E4OMHjPw0TCMaZ941K0TO288UefJr9pK9O+fikAfX/+K9XRJADuxhiTLv4QI08+R7mQZ+LHP0A00oDQdQq9O0g++QKh5mamfeNShu59GDObx0wf+LUHlcObCjxFeRWrUKQ8MAyANK3xgdIA1UTtA9rKHboLiBqhIL6JHdiVKqSzaIaOOxbF19mO22WS27gVM5XBFQ2hewRmOkt5MI5dLBGaMwOha5iZHELTcEXDuJti+DrbyW3cSml4EM3nRXf7MNNZqukspYEhfLEY4dnTsbJ5rGIJw+/D3VC7pkvmya7diF0uYwQDmJnaIPbywFBt+MqcGeg+L2Y6i+Zx43JH8La34A03kFu3iUp8BD0YQHe7x8o6jJnJ4W1vxdfZRn7TtoNc48o7hQo8RTnM+CdPYPIlHx4feD7ho+8nduxRADimRd/Ni2g+/hgmf/qj9L68mt7rbwUgMHUyXZ/56PjA8/YLzqTrfaeNn7f/tnvJB/TdBp4nZZYAQSJzZ9Ny1ims/d6PGPFYzP3CZ2jsmACAnS/Qe/09dH7wfTSddNz4wHOAyFFz6Lzo3PGB512XfgL31ImUzQrSttl+w+2Up7Yya5eB5zs1vvtYgtOnjD8TVJTXowJPUQ5BEonttkEIhrIJ/NKHJjRcY13642aOoiwiy2nsdK01milnycsiZnoIc9Vysk0+7IBgRCtTSPRT7h8knhtFdESQZp786hUkdmynLIt421qoBgSbVy1H2jbFSQ2MGibV9DBWoViblqzFT0mT9G/ewKgjSMgiZY9GsbWR4WqW3IsrKUyIUjVsRspZioO9VIYTxEcG8UxuZDAzzMiaEglZW3HCN6EdmyKlVcuxiiUKkxqI2wX0kUFyAwN40SlMjFH0QM+6l5CJFAlZxNfRhnAZDKSGMNbmSMgiRc3DUDaB28wjHYcRp4Atq3gLowTSfgBy5QIls0LWymN71OTR9UgFnqIcggQCvaojELSFm2sDzzUdz9gKBGVXiDw+Wr1RWiItAGS8/biEH9eONKWbHuCob38F99gcoNXRFBtveoCix+Lk7/w92edfYvCm+7FkhSZ8TP3gB5GOw7bf/gnN6+G4718JWq2TSaZnLX03PcC871xOPD5I/y13o+eqNOGDzgYWXvp5Uk8vJ37rwyz816vozyeI+iPQO0TPTQ9Qjemc8g//wMCiB0g++njtOCGY/ZlPk127iYGb7sfT1szx37oMgNzgIPG/Po6wBAuvvootK5eTu/sxjIpNEz5mfPITeFqa6L/tXpKPLKEJH6aj0xZuxuP3g5SktQA2Ou2BRgJj9eM1MsQCEcJ6EL3yJlfoVd6RVOApb46m0fHBc7ByeUaefh4rk9vvQ41QkNZzTsXb0fqmihDsnkznh89n8K4Hx9et2x+e5kZixx+NKxLe/XzTuuj88HkM/PVBpGnt9lrzae/G3RjDP7HjTZX5jauFz65DJ3Z+/erp2/ydbTSceCxGMIAQguRzKyn1D9H5kfNxsnGG7n0Uc7C2fJC7KcaE004n8/IGEND5kfPHekoKMqvWkN+0DT3gp/Mj55N4bCkjyThOuYIrFKL1nFMpDm9jYNGDeKMROi86F2Ho2JUqw089QUC46PzI+VSSO9hx272UenegeT20X3AmAPFHnkL3+ej8yPnoPi9CCEaWPEt+ZISWM04m7AkwsOgBUvEdaKaFr72VxpOOY3TpCqRp4m5qoP3c0xm675Hd6mTnEId91Y9a5bx+qYHnypsihCC24Cgic2dj+N7YjCS6z0vDwvm4X7X0zhvlaWkitnAewuV6Q8cZ4SDReXOIzpuz24eit7WZ2LHzEMaevw+GZ0+nYeF8vG0tb6rMb4SUEimd2iB/6eCMfy2RvLJtbLAARixC9NijEG4XjmOT37qd3MatRBfMJTizm/SLayjs6Eci0IIBogvmUuwfpDQwTPTYo4gcfQRSOhS27yC5fDXVdIbogrlk128mt3ELtmkifB6iC+biioVJrXoZ27KILJhbK0+1SmbtBkrDCaIL5uKf2Ely+SpKiRGE20V0wVyiC+aSWbsRq1QkeuxRhObMwHFsspu2kt/aS2jWNMLz5pB64UXyPX04joMRDRNdMJf8lm0kl69GD/oIHzULR8rx916ri51fS5xd6kcix+pSqtCrU6qFpxwQmtdDZP4RuPpi5NZseM19PS1NBKZ1YQT3f+HR/RFbOA+napJbu3F8OZt9icw/Ev+kjvElcfYgBA3HzcdMZ8lt2IrmNggfOQsjFDygZd43WZv9Skiqtolma+jSGQ9mU9qY2JiORdWqtWotx8aSzvi29AsvYZcryLAf7+xuBp98lpFMHM0y8bc34++aSFqWGXzyWVwT23BFQlQtE2nbpJatxHYbhE48GlwuBp98lnK5hPR7CU3vxhuKMfjksxSTo7QvPArXxDaqZoXkslWkcgl8M7vRXV4Gn3yWZHw7XukQmDoZX3srg08+C4B/7kxcEzuoWiZ2sUR61Rr01kY8TWESq16irHuoOhZaNERo+nS0QIjBJ5/FM2MK7mmTEE1RqraJiY0tBFXbRFgmSAdTOtjYmLY5Xj9V26RqV7FxkOIdMJ5SOeBU4CkHhBHw03Lau8lv7nndwPNPnkDnhecc0OsLIWg/93QAto2mXjfwWk4/aXyJmX2e77wzKO0YpDQwhBEMHvAyvzaBGFsA1mO48Rie3dbDc2sGHmHg1lzj2wzNwCV03JoLj+Em8/iy8V6aO5cHysgCLcJPw/SptJ13Br0vryZ5wz10f/kS/FNqqxg4VoXR+5+g/fwzaDxnIZnVa+i7aREG4O9soOO0kwnrXjb96BrM7gYmf+5SDJcbadkkH3iCnNui6wufGV8eKCPzhEWQlnlzCc+Zwfp/r/XSnPXP38AI136BKBfTJO99jK7PfRwZ8rP+t9djlmzcaATbWph4zhk4O2q9NKdf+WW8rU1IKTFTGTzCQJcaHt2N1+VBOg5uoeMIY3wbgEev1aMhdISjmnj1SAWeckAFuiYy+3vfopIYZes1N+wxM8m0r1+Ku6nhLS3DpE9+kGLvDnqu+/Nu2zW3i+7LPo0rEtrvCaG97S1M/cpn9nwOdAjLrdvEuh/8lK7PfRx3LMLQA48xeM9iADSPmxnfuIzCmto+pfYI8773LXSfl2JPH73/dxu6x82sf/oaicefqU0XZtWeY077+qUkUnG2/+kOop4gs7/3LYzRPoSuM/r08yQef4YZf38Z3ngfO/5yF0aqNlbRFQ0z++tfJ/HIU2xbupzZ3/sWALrfR3rFSwze+zCepgZmf+9b9N9xH+kt27BLZYThY8a3v8K2tS+y5dd/IDax9ndr589u8J7FpFe8dBBqWHmnUoGnHFDC0DGCAaTj0HD8MRR6tlMdSRFbUBsH5mqIvu6yPG+W7vPiaWmi4fhjkI5DavlqfBPa8U/qxB2LvKHVD4SuYwT8b2Fp96X2nEkiMW0L3TZxpI4mao/dLcfGlDaWtMdv2TnSxpYOhAMEZnST2dqD0DXyA0OYtkn0+Pnkq2mS6zYgqlX8R07HCblxvG7Sa9ZTHUkSOHIGaILRF9eQ6x+gnM/haWwgPH8O6S3bKFRyeLs68YUbcLxuqqUSw88sRxZK+OdMJfnSOjKZYcx0hoDHS3jebNLVFKOrX8bxGPhmdeN4a7eRk8tXYWayBI6cgTB0Rla9RH44jiltgvNmE/IFSb68lnI2jXd6F+72dhyvm9TKl7BLJWxdwztzCukXXsIRAtMx0cZuaVrSwZa73/I1HQvTNrGlXVt/Sak7qtOK8pYwQkE6PnAWwe7JaB43HReeTceFZ6P7vG/L9V2xCB0Xnk37BWciNI3wnBm0n38m2tt0/QNi7DNZE2JsiSAx/kcIgTa2HNDObSAQAvwdbXReeA6pZ19g6K6HKPcN4AoE6LzwHJpPOp7E4idxylU6LzyHxuOPRhOC5FPPU9i4jc4Lz6HtrFMZuvthSlu3oyHwT2in88JzSD79PLm1m2g8cSGtp5+MJgTmaJLhex8Gy6LjvDMZvv8xRp94FqdYwtfSTOeF5xCbdwSDdy3GHQnTfs57x8sbf+gJ7GyezgvPoeHY+Qzd9RBmfBR3KEjbWafQfu4ZDN/7KJWBYdrOPIWmExagCcHIo08zdNdDeBtitJ/1nrF6AA1trG40hKC2fZc6G39N9VipW6qFp7wlhBCg67iiEXwdraBpb+ttwZ3XF1Li62zDCAcR+jvp97uxLvWytt5d7Y+GMTbwXB9b200Xr2zThIaGBuUq5mACzXLQ0XA1RPA0N2EOJrBG0wTaWvHFohiagbBszMEE3kgYhMAcTGBXq+gIXNEoetCP2+/HHEzgb27C8ekYhgvNgcpwAlmqEOhox9ANzKEEugRN0/A3N+MJhWrny+YJdrbjCYbQEZQH4wD4W5ox3O6xciXR0XA3xdAaolgjaayiQ6CjDRGLYugGompRHU3ha2zAcLtrZY8n0YU2Xk+GbiCd2vsWQsMQ+i51pmNoY61ktR5eXVKBp7ylmk99F82nvuugXV8zDKZe/tmDdv2DobClh82/uHb8+9Yz3zPeaWXYbXHKP30bt9+PlJLywDCJsU4r0nF2O67pPSfQdNJxpFetYfMvrmXmdy4n55JoQqOaSrP5F9eSmxJj/pdqA8+3/PqPABgBHxM/+UHcuQqbf3Et8ZjOGf/4jwghMLO58WvM+udvkF23cbdrdlx4DjLkY/1vrmfUFMz5wVXECyk0t5vC5u27dVrZces9DN710NtTqcphQQWeohyCalOL1VZoiOdG8eGtTS2m1f7JJsw8BVmCcgYnXWsxZcs5crKEJmuL3Lad816MYIDtvZuw164meNHpBCpZ4qU0pZdfJP3iGkpeg7aLTmfjsqVUEqMUZQnNZdDx/rMZGBxg2++vwx0N47vodEbsIsmhYdLLV+PHIHjR6cjhXlbecCPmSJqqqNDxgXNwF1NseOgh/N4A/otOx1fNMpRJkF75EsXefkIX1XrTvrjoTiqJJGVZwhUJ0Xrme9iyajnVSgXX6cfh8QUZzo0yuG0zPas34vXXzrf2oQewK1UqQ3GqsgRAUbgZzo3gNgsgJSNOCVtW8BeS5NK14S+5SoGKVSVnFWrTtil1RwWeohyCalOL1Z43NQcb8Aa86JqGx1Xr8FE0Anjx0uwN0xKpDa9Ie4MYwosx9mi++6ijcTfF2LHtboqDGaZ96njco/20RptJlXopbhyAyc1MO+54tr6wEX1HEh9edN3DtIXHM3j3YkY39hOZF2XicccDYI+mKfTECXhDTLv4eHJLS+jPPIPXdhC6j2nHLsQ11Evq+U2Emz10HXc8It5La6QJO1VB7kgy7ZLaudYvXoYrmyOAF68/xrTjjqfnpW0UUznazz2SWGMzADlrK7mtQ0S6pzD5uOPZ9OQqKvERPICk9kzWlDotocba1GKOZFTzYSNoCcQIjNWPu+Ai5o8Q1P3oppparB6pwFOUQ5wY66Qixjqv7NymiZ2dVnY+m6x1XPG1NhOdfwSZF15CAp7GGIFJnbX9bIeRR5eiSWg761RGK1niDy3BSmVwN8RoOHYeAPGHn8QI+Gk761Q8bc1oQiPxxDMUcklix84j4vITf2gJub5NRB1JYMokQtO7STy2lEwpRezoI2lobUcTGk6pQvyhJbhCQZpOWEB88RIAZKWKf0IH4TkzkED8oSUEJk/AP3sqmtuFcCTxR5+mUkrTdNLxeDWD+ENLsPNFBILwETNwNzYwsuTZsXdfqwspnFonllfVj+CVekR10qxL76Sn+Iqi7Ad3UwONJx9HZs0GRp98Dk9TAw3vWoBdqWCXyow+uwKnatJ00nF421tJPPIUVqGIZ+y42ML5jD79PEYwUJvTcvZ07EqF1HOrKPTsIHLkTMKzppF45Cnym7aiuQwCXRNpeNcCks++QG7tRqLz5hCZNwe7UsHKFxh96jmMgJ/IvDkkHnmKxCNPIaXEN7GDxpOPIzRzKqNPPod/8gRix80HKWtlffp5qqNJYscehbe1icQjT+GYFprbTXDmNGIL5x3s6lbeQVQLT1EOM7m1G1n7/R8z44ov1lZL0DSqyTQb//t/ibtNTvr2lWRXvMzaq39EyikTw0PX5z4OjmTt93+M7vUw+1++VZtEGsi8vJ6+m//KjL+/jERikN7/uw1frgKAv2sSsz77OdLPvMD6//gls//lm4TyI7iCEfKbe+i57s8MRTVO//5VDN61mIG7Hhwv54wrv0xu/WbWfv/HeFubmfOvV4KmkRscYvPv/g+/JZj13W+yZdUKNv/iWlzlsQHwl38WT0sj/Xfez8Ci+9/+ClbesVTgKcrhyHFqQ0HGQgspwXGQjkRotRs70nYABwSIsVuBtX0chK4hxpYHQtbOJzQBQiAdB5xX7gnuPB9jx6Fp1EbN166JFGPlkHseJ2rHSem8UlYk0nGQtqiVQ4ix8+wcmDj2viS7nU9RXo8KPEU5zASnT6H9gjNxRWvLHg3d/xjFvn6mX/FF3IkdbPv9TTiZHAjwtrcx/RMfY3jxEoTQmH7FF0HUgi3+6NOkV76Mf/IEpl/xRXb85W4SqTh2oYgr1kTXZz9Kee0Ktvz6j0Rnz2DaNz6PMAysUpHe2x8iHAgz/YovIrdvZNNPf4uVyaEH/HR/6WIAtt90B77OdqZf8cXxlSn6b7+PQnKUSX/3ISKBMFt+9QeG86N4q1UC3VPouPAcBu9+CDOTo+H4o+n6/Cfp+f1NB62ulXcWFXiKcpjRPB68bS3kNmzBqVSRto0rEqYSH6WSTCGHEnijESJdE8EtqcRHMQJ+jFAQb1sL0rbJvrwep1LF09KI7nFTiY9SHk5g22UCU7sIRhqpxEexy1U8TQ14WprwtDSSXbOBQnKwdj6/j0p8lOpoCoYS+Dra8LY1U4mPAuCORfG2NOFta8Eul8m8uA6haxjhENVUmkrBpDyUQPoFoZnT8DY2UYmP4gqH0DxuvK3NuJtiB7m2lXcSFXiKcpgaWPTAHqslDMkCLfgIz5k+vlrC9htup/vLlxDongSAUzXZfuMdtF9wJm3vey/pVWvYfuPtALg7G2i/4EzCmodNP7qG4pQYEy79PC63B8ey6LtpESMei7m7rJYwLPNMJEDD8UcTPmIG6//tldUSXJEQAGYqy/Ybb6fr0k8QHBt4ninVxsr5Ojpqi9f27b5aAkA1lX6ba1V5J1OBpyiHICkkZtBCAOuHt+AveGtTZWm151w7ynFKMk0y10+iv7Yc03A2TsbJMNDzMpuvTeI5qhPhmszqLasxly8hL9MUDfCechyZ0ihrrvstpkcnfNZ8NtlJnJcGST23GjSB94yjGB3YhH3tKsxslpJM03jKCVSkReL+u2tL7Zw1n6pmsW54C6WefvKbtuF97xGUS3mWP3QfZAsUZZqi3yB60nxe6luHvWEl3rPmA7AuvZ3yuji5NZsQbgPPWfNZvuIpKukMuWoKl4Dm008il8/w9K1/xu314j5rPhsKg+j9o2Rf3kBhWx95maaEj3VDm3F5vUgp2eqkcGSV0ug2/P7aQPySWcGb8TBcHcEKmAfjx6ocZCrwFOUQJKTAlTcQCGa1TsUbHBt4btQGnvu9z5OnQGuok5aOGQDoyQJRLYIrX4ENcWZ+4MO4G2PsuOUuUlt7aSVCXLc54T1nk3n+RQafXUxlcjPH/t15ABS29LJ1Qxzd62HOpy9l8O7FjCzfRq13SIRZJ51BPD5I/y13E/OGmP6p89g4vI2pzZNIbk4R3zTKnIs/S+9wL6kb78ZI5YEIw16DE884n4E77yfbt5FZnz5v/H0mB8r0b4jjbWtm+sXn0XPtzaS2pYjgJeTyM+e0c9nywnKyKx+jsbubyae/cmz/0k0kt6ZoJcIgOrPbptUGnkuJpsWwKdPdOIVARxcAqUKGWCBCq7sJo+B6O36MyiFGBZ6ivIPsOgH3zq9fPSm3f0IHje9eyMgTz+KYJsWePoygn/bzzoDsMIN3L8YTjTDxYx8gbVgIIYg/8hRWvsDEj70f6TjsuOVuSn0DCMA3aQKN71pA/JGnycgKzaeeSFNDM0IIKvER+p94CW80QudF59J/5/0M5xNohSLBthaa3/MuzNEd7PjLXXhamuj4wNnj5Ry86yE0t4uJH3s/TrXKjr/cRXkojhEO0XDKsUQCEfpvv4+iYdJ+3umEm1oQQjB03yOY2Tye1iY6zj9zfK2/nXUhx3pz7qt+lPqlBp4ryiFJIjWQmqRiValYFSpWlbJZoWxWqDgWFWlTcczxbZZjYWJjB724Z0xmdOMmhlesopDN4AR9uGdMxjWxndFNm6nYJt65MzAmtFHIZUhu2UI2Ecc9YzJGVyfxlavJp5JYHgMaQrhnTCbV00M+EcfdPQHXtEkUchlKqRQj69ZTNiu4pk4kvvolUhs2YeIgo8Ha+VoaGFm7Dtvnwj1rCsV8lmIuy+iGjRSLBdwzJiNaYgyvWEWpVMQJ+XBNmYDRPYHR9Rso53J4Zk5Bn9BCIZdhdONmRtasw3LrGFM6qUgbWzhUrMp4XVSlTUXalO1d6mzsa1NaSE0NZ6hHqoWnKIcgKcD2WIBgODuCz6lNHr1zqZuEladACVnOYI1NHp0p58jKMtmeTQxedz2VbBwHm8ZjF2C0t/D8ddeT000mn3085ViUwUyckb7tbHnyBaLzZoOUPH/d9SAdSk6R2NFHEZo5lWxilB3XXU/DSfPRPTopq0hxxzYG7nmYXNjF5PNPJj4wzLbrb6Bs5Sl6BI2nHUtRajx/3fVkvZKpHzyVfCxMfmg7/Xc+AEDje44mNZpi+3XXI02TMiVa3n00TijA2rvvwe1oNJ5/ElXNIVHKYPduI/HEszQsPAq338f2jVspLXuGMiUKeBnKjuCqFkA6JGQRmyrefBJ/uraAb6FSpFQtk7ML2B41eXQ9UoGnKIcgIQVGqfYMb1JDx/gzPK9rbLV4d5SCSNPib6K1eSIAlaEEPhHAXapCfxbwgYAJk2bg75rIxjufxPJYzDriGFw+X+1Z11CSXH+W7gtmIG2Hbfc/N1aAIO0TptE4fyGZ1Wvoe/gFZs6cR87lIITAnzep9GdxPA3MnruAVHY5wwNrgACjhmDqjLm48xV6Hnges8FgzvzahNFWNkd5IAdIZs2cR3bdJgb6Xxy/Zlf3HGTIj/XI8/hNwZx5C4kXUoS9Qay0g9OfZfrHj8DT0kT/5jipoQKIIANSY3JDJ95AAOk4lLQQjqjQFWkjOFY/yXyaWCBCgxHBKKmPvnqkfuqKcpia+rXP4W6MMXTvI4w89Ryzr76SQGoAzeMm+cwKhh58nHJHlHlXX0nfTXdS7N0BgOZxM/Mfvkr80adZd/WPCc2ezuyrr0T3eihv2cTgogeIeoLMvvpKqsufYsP/+zVUqghdZ+Y/fpXeRB99f/kr0YZmZl99Je7RPgCG7n2E3LpNzP7eFSBg6//8kWoqC4CnpYnur3ya/lvvppjJ0P2FvyMaa0LoOvnN2xh68GliEycy++or6fn9TVRHUzhV1dNSeWNU4CnKYcbd1EB4znQKm3sobN2OEQ4SaZyD4feh5QxGn1kBpQqxY+eRdIqknl9NZSSJ7vXScPzRgCC14iV0r4fYwnl4O9ow/D6Sy14gnx4hOL2bgNtP6vnVFHr60AtFghM6CEyZRHrVGgrFUYLTphBurR3HiGTkyWVoLoPQnBmklq8GwMzk8TTFCE6fgtA0Us+vxtPSjDGxDT0YQPe4GX3qOcrZESJHzsLj9pJ6fjXVdAa7VCY4vRtXLELquZUHt8KVdwzVaUVRDjPethbazz+T5HMrGbr3YTyNDbSc9m4AHMtm+MHHcSoV2s8/g2D3ZIbufRgzmcbdFKP9/DNpOf0khu57BN3vo/38M4kdMxeA+CNPk1u/mYYTjiF27DyG7n2YzMvrwXEITu2i9ZxTGX7wCVLPrSJ27FE0HH9M7ZrFEoP3PIwRDtF08nEM3fswQ/c+jFOp4J80gfbzzyR69FyG7n2YQPckmk45Ac3tQjoOQ/c9SnlwmOb3noh/UidD9z6MnS8CEJk3m5bT331wKll5R1ItPEU5zNjFEsXeHXhamtADfoygH8e0KA8MURlN4OtsQ+g6xe39VBJJADytTbgiYYq9O3AsC/+kToxQEACrUKQ6ksTT2kRVt6nERyg7tY8OzePB19oBmqDUN4BvYjterYJmGNilMpX4CNVkmsCkTpyqSWnH4Hg5vZ1taG4Xxd4dmLk8vkmd6F4P0rQoDQxhCA++iR2YfjflgWFkojYlmbe9BeFy4VSqlPqH3+baVd7JVOApymGmsLWXLb/+IzO+/RU8TQ0AVEaSbPn1Hxl2W5zyT98m8/yLbPnVH0hTJiZ8dH7wXKTjsOXXf0Tzephz9ZXjqyDkN22j76Y7mfmdy8cHnvtztdlLfJ1tdF/6KVJPL2fbb//EnB9chTs3guEPUeztp+fam4nHdM74x39k4M77Gbr34fFydn3mY2TXbWTLr/+Ip62ZGVd8CYDswADb/++2WqeVH1zFlheWs/3G23FVaj0rJ37yQ3hbm9hx6z0M3r0YRdlfKvAU5TAT6J5E69mnjs9TGX/kKYp9A3Rfdgn66ADbb7gdK5kBAZ7WVro/dCHJZS+A0Oi+7JLa8j5CMPLkMjIvrcfb3kL3ZZcwsOhBEslh7GIJVzTGxE9cSGnzS/Rc+2dCUyYz5UsXI3Qdu1Rixz2PE/IG6b7sEuyBLWz93/+jOppE9/uY/OmPANB/+714WhrpvuwSNHdt5pPBexZTGB1lwofPI+wL0/P7m0lk4rirJv7Jk2g79zSGH3wMK18kMncWky6+aHyeT0V5PSrwFOWQJGtLyglJ1TLRbR3b0cZnDTEdm6q0qToWFavW2rIdCwsH2+fGmNhG1bLAMsknEhQTCdramtApkOnbgTAtMDSkz43e1kQxnUboOsbENpBQKZfIj46S7dsBkQB6WxPZgQHymVF0Xcf2uNDbmqDPTXZgAG9XJ0ZnK5VKuXbs0DB6g0NTWxNaboh0Ty9C03CFQ7XjgNzwMERDY9eUlEtFcsNxSuk0wcYoWihKunc7RbuIpntw/B70tibyIyNUE0mCC47A295EVdo4CKq2ibCq4DiY0sGWNlXbHK8f0zapWiaWtEENPK9LKvAU5RAlNYlA1GZQsS10TUOza7cZTelgIbGkjWnXVgK3pYM19se0LYYXP4GVL+BpaiDS0ULvovsYLqdxW1VCU7uIHDmLkewIvYvuwzuzG09TA6ZtIU2TwXsexhWL0nzBGVjFIr2L7qNSLKJFQsQWHEXIF6Z30X0UKNJ99nvwdbRhmlUG7lnMSDVL07FH4UGnd9F9xHNxQtIhOnc2/kmd9I6tUh49aSHetmZM28LK5Bh+5Em8XRPQtQkMPrGUjNSxbAujpYHGBcfgdgS9i+4jtGAumsuFa0IbpmNhIXGQWLaFZlvgOLXgR2I6r9SP5diYtokjHaRQgVePVC9NRTkkCYRd++N3+8b/BD1+gh4/ft2FXxj4dc/4NrfuxisMfOj4HA1n8w6sddsIRxpomjqV6sr1lNduwe/y0tDRSdvRRxFubcNat42mKd20zpmDz9HwWAJrfQ8hf5D2o+cRDseorlyPX3MRisZonjOL5qndWOu2QaFMx3HH0jBpEl5HYK/vwe4borF7CtHmVqor11PdvJ2gL0BTVxcts2ZirduKtW4rbfPm0jhlCj5Hw122sNZtI9bWRmPXZMz126iu2kDA6yfU2ETr3COItrRirdtG65zZTDjxeKLNrfgcHb8wMNDG6yfg8eMTBn5hEHB5x+vH5/IS8Phxay6ErT766pFq4SnKYSa3dhNrr/4xM676cm21hFvvof/2e4Far8qZ//BVsiteZt3VP6EyqYkFV18JQlDY0su23/0J3eth1r98k6F7H2Ht1T+mtloCTP/mF4gnBtn+f7cR9YaYc/WVGMPbELrG6JLnGH54CbO/dwWe4V623/RXXKk8AK5YhNnfvoLBvz5E/PGlzLn6ylpBhSC5bCUDd96Pt62ZOVdfSc8f/kJq4yZsx0K4/Mz+7jfZsnI5m39xHQ1TptSOHbutO3DXQ6SeX/V2V6/yDqYCT1EOR1KCEGM9LWXt+zE7e19K6SCR49/vPE7KXbbtchyagJ2rEezcRwhAIJHgSIQmas8ZpdyZk7tcs7YPQuyygoHc/ZpybJ9XXXNnOV5d1t3KpyivQwWeohxmgtO76fjA2fTfdi9mLoeVLeCKhpny+b/DndjO1t/cSGTmVGZcdRmjZq0V1nfzIhzLZsZVl+GYVTb95LeYudproVnTaD//TPpu/is5t2TCRy6guaUNgGJfP5v+9BCxI2Yy7RuXsuWXf2CgnMSbrRLpmsyED58PvevZ+KNraDzhGLq/cglAbZHWa/4PX0cbM666jGo6zYb//h/MdA53Y4yOD51NJBRl88+vJd8SpOvznyQUjQHQc+3NVJIpGk9cyJQvXsy23954EGpZeSdSgacohxm7XKbUP0glPoKVL+Cb1Im3tbm2LZsk3NaMt60FT3Mj2nCJ9MqX0XxedKDUP4hjmlQSo3g7WvE0N+IKhyj1D+JpbsD0aniaG9D9ftIrX6aSGCXWXlunrtQ/RCUxgkWJ4PQZ+JpaKPUPUk1niXS24WlrxhUOkVm1BgBPYwOa20WpfxAzk6WaSBLonoxoDFMZSVLOV/F1tGI1+mrjCS1Ir3wZIxJG83lxqiblofjBrWzlHUUFnqK8g8hdbuHt/Fq+6rZecXs/22+6c/y2YeMJC/B3TWDjD/+HuMdi1ne+XVstAagkRhm5+R66v3wJ0nHY+psbgNqiqbEFR9H47oVkVq8dH3juc0k0oWFmc/TdvIjilBidl36e1NPL6fvzXxFCoAc8tL3vvbhzFXquvZlkTOeYf/xHAKxcnr6bFwEw65+/QXbtRrbfdOf4NZvfeyIy5GP9b64nNzbwPF5IoblclHq303fzIqZf+WU8LY3033YvyVfNo7k/9aPULxV4inJIkjhGbSmeVDGLT6+iCw2XXhugnbFKFKjiqRYwChkAitUieVlFpzbubPLFH8EVDTPwzHIqzz5Dx+UX4+RHSZtFcitWMPLUc1Sbw0y//GI2L36MUv8QFlU0t5spX/gkiRdeYut//QT/5Ik0X34xOd0hsb2X+MNPEtS9dFx+Man1L7Lip79GFso4uk33Fy/GSg2z/rZFhMJRWi6/mEo+TrKQJvH4MxQ299B5+cUAvPiHG2oTQVPF3djApE9+kG2PPU05nyf60fcRDkdJlXMMbVjP1idXEGpuofnyi1l7+x1YmRxWoYQz9l4ruEgVM7ilCdIhKyvYVEmVc1TG6idbziGRFO0yjuG83T9Q5RCgAk9RDkkCzdIQCGL+MF7f7uvhZQwfhvAQdQdoDEYBiLv9INy4x/qDtHRMqI2te2Y1xYJF26QuKqMGDcEownFRSBXRwgHaJnVRqD6BljdBeNB0D22TupCrN2GlSoQnGbRN6gKgpA+TyZQJel20TepiYMdW9HQJzZYI3UPrxMmUXSALJiG/RtukLvJxaAhEKVdB5qq0TpwMQCpfxV20QXjwuP20TeqiZC5FL1o0tXXQ0NgCQBIXTrpEqEGnbVIX2aJNJVUaq6ZafRTQiPkj4+vhhYUHR0DMGyI4Vj8CiAUi+HUvmqWGJdQjFXiKcpjxTeyg6eTjSTy2FKdaxdfZTmTuHAAc06L/tnvxRCJM/OQHiSeH2f6nO6jER/G0NtFy+slI26bvz3/F29bMxE9+EFc0DMDAogfIVPO0nPZuIoaf7X+6g+RIL02OQ2jOTKLz5tB/270kzCztp5xArK0DACtfpG/xIrztLbS97730jd3CtItlAtO6aDjuaJxyhe1/uoPQrGmEYyE0txvHsum/7R6KhkX7BWfiqths/9MdmJkcALHjj8bb1sLgXx88CLWsvBOpX3MU5TCjez1421upJlOUB4fxNDcQnNFNeXiE6kiK8mAc6dj42lvR3G4yq9eiuVx4W5vxtrfiaW2mPBjHFY0QnX8E3rYWysMjlAbjmNk87sYYRjhIZvVaKsMjeJoa8LY142lrpjwUx0xnCUzrwjehvXbN0Vo5NJeBqyFKZvVaMqvX4oqE8La14G1vxRWLUB4cxtfZRmBqF9VMprYqw8AwTrmMp6UJYehkVq/FCPjxtDTWjm1pOtjVrbyDqBaeohxm8pu2senH1+yxWsKmH1+z22oJG390zfhqCRM//gGk47Dpx9fssVpCbv3m3VZL2LHragkTO5g21mllyy+uY84PrsKbG8HlD1Ps6aPn2psZ3mW1hF1XN+j+8iVk121k04+vqa2WcNVlQG21hJ7f3bTbagk91948vlrC5M9+/JXVElTrTnkDVOApyuFofGA3SMfZZSC6eOX72o67DSbfdVC4dCTjg9aFqA0I37XHoxg7dudAce2VayJfuWZtH+eVgeg7zy/HtgmBENruZdXGBs07DlLKV8awCwHSqe0n2G1QuqK8HhV4inKYCc2ZzsSPfQDN4wZqz96KPTuYc/WV+IZ62Pyz3yGLZRDgm9jJnM9/trbEjqa9MnWXEAzd9yjJZS8QnjODOVdfyZZf/5FEKgGmhbuplWlf+xyrlj/Fhv/6Fc3HH8Psf/kmwjCwCgW2/PEOorEm5lx9Jdrml1n3rz9BmiZGKDDektv8y+sITh+bLmysNbn9xjsoppJ0f+FiIqEo6//zVyQqWfyWIDhjOpP+7kNs++2NVEZTtJ17Ok0nHcemn/z2oNW18s6iAk9RDjNC09F9XkafWYFdKmMEA4TmTGf0mRVkssMYhRKBjjYC07pImgVGn1mBb1In7lgE3efFsSwSjz+D5jJofNcChKYx+swKzGwO3eMmcvSRRIJRRp9ZQSWbou24+QS6J6O53Yw88SyZfILQjKl4XbUyZBO9BMoVQjO68U1oZ/SZFQBEjpqNb0IHus+LlS8Qf/p5PK1NaE0RMi+tw9Y82KUSRjRI0+w5+L1+Rp9ZQXDmVIJS4utsGw91RdkfqtOKohzKRG3gtJS1W3uOdGrL20iJIyWSXbbxyve2ZZF4ahlDDz2OEYsQOeZIBu9/lJGnn8eRDr6uibSceTK+yZ0MPfQ4gelTiB47D9uysMsVhhc/gebz0nzmKbiaGxi8/1HsahUjGiL2rgWE581h6KHHKQ8naDr9JALTurBNk6FHlpB64SUi8+fgm9TB4P2Pklz+ImiC4JzpxI4/mqGHHmfooceJnXAMoTnTsC2LaibL0EOP45vUQWjuLEaWPs/Q4ieQgLu5gcb3nIBnQhtDDz1O5OgjaTnnVDwdrTi2jSNrt17lWD28Ui+1ZYN2rR9nbN5OxOtVvHI4Ui08RTkESSExAyZCCDaP9OIredGEhqHrAOyojlIkSy4/RGpoKwDDuREyMsfQptVs/Mk2Gk8+Dt3v4+WXV1F6+D4qZCm6BA0fPIWegWFe/slPsBoCdHzidLYbRaqrlxFfvATNpdP00VNZv2ETxZ88jlOpUCVHx/vfR7FUYMVtt+B1eWj4xOlQzbF5ZDv5tZvIrF5D04dOQmbTPH/nbZApUCVLMezBfv9prFu7juryp2j8xOkAbCkMU1i9jOSylRihAA2fOJ0Xnnmc4nCCUiWNW9OZ8LELyA4M8PR1v8fX0EDkE6ezrZpEDGVIPrOC3MYtVMhSEj42JXpx5bzgSHqc2oB2K9mHf6g2yLxYLePPjTJiJbH85sH5wSoHlQo8RTkECSlwFVwIBNObu/AGdx947vE0URB5WoLttLZPqx00kiUswrirVRgxmTF9Hp6mBvrWD5NO7QARYVizmHfEsWSKLzI4sp5KIMLR804AIF/twRgx0bwac446nsHeLKMjPYAGIsLM2ccQjw/Sv3QdMa/BjHknsGFoK1ObJ5PcmCA+ajPnyOPoHeoltXQjrqwAEWHI0Jl/1AkMbEmRzSSZddTx4x1jRvtyeEdMPIaLGfNOYNvzW0klB6ngI+TyM2fu8Wwxl5N7djMNsRBdY2UF2GGvJ5V0QEQYkBozWqaMDzx3tCiOqDClYRLB9ikAJPNpYoEIza5GjILrbftZKocOFXiKcpgJTJ1M2/tOY/Cuh7AKRaqjKYxwiMmXfBhjtJ/e628l1DWRqZd/lpRdm7Gk//b7cKpVpl7+WRzTYuv/XE81mQYgOH0KrWefSv8d95PVbTo+cDZNTbXVEkr9g2y9/QnCU7uY8qWL2fa7mxgoJfFk84QnTaDj/Wfh7NjMll//gchRc+j63MeA2m3a3j/+BU9TI1Mv/yxmNsfmX15HJTGKqyFK2/nvJRKMsu03N5KPeZh08UWEGhoB2P6nO6gm00TnH8HkT3+E3utvffsrWXlHUoGnKIckieNyEAgypRxV3UTTNErVMgBZu0xBVvCYRdw759I0SxRkFQOTgDCJD+7ATGcwggHc0RA5YVLWHKxCBrQO3I0hytkqw309jKZHEJrALUwcaTLc14Pu86I1+NGDbnLCJF3MUPIZhKJ+SiEX2b4ecslRqoU0sloETOJ9PaSdIpFQjHKodlxFc6jmMxhugR7xkt3RC0Aym8IX8qIJk6pZYnj7NlyhEFrES1F30DBJFTJUYw1UGwLkXZJ0Xw/JTBKzkEVYJbwEyMoKFeEmU8pSwkJKh5ysYssK6Uoec6x+cpVCbVUHp4x0qbk065EKPEU5REm9Nr6sZJYRlkATAtupDb4uOyYVbMp2lZJZC0HTrm1Lbe8le8MtWNksEofIcUfh7Wxj4w23kHE5zLrgPDzhECWzRG5wiMEHnqLl9JPAkWy84RaQEtMxaTr6aMJzZ1Pc3s/GG26h44KzcHsEpiHIjozQd/Misi0BZn3wLIobt7Hpptsw7SqO10Xs7JMxTIeNN9xCOqgx9yMXoAd95DNpem+otcg6PnA25f4hNt5wC9K2sbBpOOVYCAfove9hPLag86JzyTkVKtgUenoYuu9R2s89DSMcIvXcKgaefhYTGwuHklnBNjSk41DGwsGmZFXQxuqnYlYo6jqmtMbrVqkvKvAU5ZAk0Ms6AkFbuHmPZ3hlV4iC8NPijdIabQUg4+3HJXy4zSqkq2i+KEII2hrb8TW2Ua3qWNJm0qSpGC4XdrVKBTeeqk5HtBXHtKimazOo6IFGWhraibZOJJeqQFVn4sRuspoJpoXftihWdSxp0NncSWZrAtJVdH8Dhtehrakdd8HEqepUbcHk7hk4VRMznSFfrXW86WzqIDdSxElXQQj0QCMdTR3IgJeipeO3NSZ3TWc4O0JQ8+BoBcyqTkdDO+7GGLpch5G1QPgxpVarp7FneGnhxxE67YFGgmP1s/MZXlgPopX1g/AzVQ42FXiKcpiadvlncTc1sOOWu0k8tpTZ37+CwOgONI+b0aeXM3jPw1QmN3Ps969g6zU3UNy2HQDN62H2v3yTwXseZt2//oTIvCOY/f0rAChvXM/ArfcQ9QaZ/f0rKC99jA0//B80y0HoOrP+6Wv0DPXSd9NdxJpamP39K3DFa7cwh+57hOyajePnWv8fv8Aamwja09rE9G99kZ7r/kwxlWTK5z9JbGy1hPzmHgbvfpyG7inM/v4VbPrxb6jER97m2lQOByrwFOUw4+1oo+GEYxhduhxpWXiaGwnNnIoQAseyGbxnMW6vj84PvY9EdpSBO+6jOjKKu7mRppOPB8dhYNGDuCIhOj50Lp7GGEIIhu5/lGwpS+O7FxJx+Rm44z7SQ1tpsB2CM7oJHzGTwXseJlnN0HzCMcRaO2rXLJbov+N+3NEwLe89kYE77gNqqyX4p0wievSRSNNk4I77CE7tIhiZUxtQ7jgM3PUQJcq0nnUKbhsG7rgPM1sLyegxc/G0NDH8wGMHs7qVdxA18FxRDjPuhiiNJxxDbv1mkstW4gqHiM4/AgBpO6SWv4hjmjQcfwze9haSy1Zi5Qq4wkEaTziG6NFHknxuJZrXQ+MJxxCcXuvWn165hmLfAKHZ0wnO6Ca5bCWFnj5wHHwdbcQWziO1/EXyG7bUAnB2bbiEU6mQXPYCut9HeO4skstWkly2EmmaeFuaateYNoXkspV421sIz52NMAyk45B6bhXV0RSR+UfgaWkiuWwlTrkCQKB7EtGjjzg4lay8I6kWnqIcZsx0htTyFwlOn4LTNQF3Ywy7XCH78gby2WEic2chXC5SK16kMNZjMjijG09LE6nlLyIdZ7z1BFBNpils3U5oZje2UyG/aSuMLaBqhIJEpx2J0HUyq9YQnX8EFVlA97gxMznym7ZRSPbTumAudqlE5sV14+UMz52N7vfWArhaJbpgLkY4iFOukFm3HhsXkaOPwNEtsi9vgEQagNCcGeg+D1a+SOal9W9v5SrvaCrwFOUwU9oxyI5b7hpfHkhKSXU0xY5b7iLusZj1ndryQDv+ctf48kAtp52EdBy2/fbG8eWBdq6iUNzez45b7mLmdy5Hxgfpv+VuCmPLA3maG+n88Hmknl5O/+33MecHVyFzI+j+AOXeIXbcchepmM6Cf/gHBhY9QPLZF8bL2fH+M8mu3cSOW+6qLQ90xZeQUpIbHGTonofxW4I5V19FeeVyhu5+ZHx5oLb3nYanpZH+2+4l9fyqg1HFyjuUCjxFOcyEZk2j88PnYQQDAAze9RDF7f3M+u438A73svmX1yHzRRDg7exg1mcvof+2e0ETzPruN9i5ZNDwQ0+Qem4VwRndzPruN+j5w18YScVxShXcjS10X3YJlZXPsPFH19B49FHM/M7ltdUSikW2/ekuIpFGZn33G4gta1n/H7/ALlUwggGmfePzAPRcdzOBKZNq+4ytltB38yKKySRdl36CaDDCxh/9L/FSGl/FITZ9GhM+9n6233A71WSaljNOouGEz7Lll384WFWtvMOowFOUQ5JE6hIpJLlKActlo2saVas2B2TerlCQJn6rjK+UB6BsVShJk4KwKbkEdiYJjiRbylG2SuTNMmXLJJ9LozkS4TOoejTyZpmcVUIzDEouUZtkOTlCtpAlb5XBKpM3y2RyafKlAobbTclbO67qWNhWGa8w8bnASY5QyGYoVMvoO69pVzGzKTSXC5fXVzsnkLPKSEyCLoGUNs5ogly5QMku47Uq6GaZTDZF0Smjef0UXYyX1bTKBDQH6YK8NDFxkasUqGq1SaQL0sSWJrlqEWesfgrVErquU3aqSEMNPK9HKvAU5VAlQSDQhY4uNDSho421hHShoSPQhDa+TRMaGgJdCDRNY+Duh7GyOcJHziI6ewb9Ny9iuJolaFpEj5xN7PijiQ/20X/zIhqOm4+3rRVN05BVk74//5XQzKl0XXwR5cE4/TcvQhZKeBtitJx6ImF3gP6bF1EKCeZ85P14G6LoUrDjlrsZlSUmvudduE1J/82LGCmlaJCCxuOOJtA9mf6bFwHQcc5peJoa0DQNM5ml//Z7iS44ipBbZ+jBJ8hZIGyJf0IbE99zMnq+Qv/Ni2g95V212WMaojiWTW20Yq1ONE0DR6DVVodF03apH01D1/Taa1Itl1CPVOApbykzk8Mul/G0NI1PGPx2klJSGUpghALjt/jeGQTCEQgEfrcXr3v3gec+zYUjDPy6m6DHD4Bbd2EJA3fVQU/m8FRs9Kok4PbhcvuQ24dxKBJpnUiksZlQMELWM4KsSho6JuBubqSaTCFNC09V4nd5CQbD4CRIbx8m0tyEu9FPIBQmqPvIVyVul4emaVMRFRMzmcVTlRhC4vcFcJllUtuHcbQi0fYuIrEmvIEQuWptlpOmKVNACMxkDiddwFOVBDx+bEMgBxJQkUTb2ig0hgiGImhmjlJV0jhpMu7GGGY6S3kkgU8YGGj43T68Hj/ScfAJA0fYBAzveP1UzSoBtw+35kLYKvDqkQo85S2VeHwp6VVrmP29bx2U60vLYvMvrqXlzFNoOe3dB6UMb7fC5h42//R3451W+v7yV9KLHgBAc7uZetmnyTz/Ipt+9lsqk5tZeMWXgdoA7107rQzevZjNP/3d+Hm7Pv+J2moJt9xN0Rtixt9fhjO0FaHpJJ9/nvhDS5jzg6vQh3oZuPFuXKkCAK5IiOnf+gIDd95P4olnas/sdq6W8OwKBu64f7zTyrbf30RqwyYcbITLz7SvX8qWF5bT+4e/0NDdzfQrvjhenuHFS1SnFeUNUYGnvCWk4+BUqkir1rNu59gpzeNBaG/9b9fj1zet2veWhV0u165/EFqab5xEarX/l60KWKALHUmtdVRxTMrSouKY43NpWo5JVVpIzUFzuSgW81hZnXK1Nhel5nFjuyTFfK72XM1jYBkaJbOMUzEplYuYHh3NrVPIZSlVy5SlhdB1NJdBsZCnUilhuXWq7tpxVatKMZ+lbFUxPbXjKoUCVcfCEQ6a24Vj6BSyGSrSxvTolM0KCHDKVcpmpXZNQ1DIZihbVSrCwXIbmC6dYi5LxaxgunWqBmNlrYIjqWBjunXsSgVbGJStCtLUwXGoSAtbWpTtKvpY/ZStCmWzjOlYSE3NpVmPVOApb4nqSJLNv7wOx7TAcVj37z9HANOv+jLuaOQtv36pf4htv7kBKUHaNonHnib57AtM+8aluCLht/z6b55AOLVneF7Dg9fwoGs6HsMNgEe4MNHxaC68Ru02p6G5cAudxlnTmPiJC9nyqz/UblHaNsGGxtrUXYM99F1zI43HHMW8717BSDGN1/Cw7Y+3gxDM++6VOJUKG//7f3FMCw86kSNmM+Gj72fzL35P3qsx9ZMX0dzaiWYYmH2D9Nx/F83vOpa53/4aG374PwxW0wQtnYZp05j86Y+yftNL9Pz3b2g79zQmX3A2mqu2Ft2G/7mW0PQpzPvulVTiI2z9f/+LtGxCTc20feqDRMMxNvzwfyhPbmL2Vz9HMBRBGAabf30jlcQI7RecRecpJ7LxR9egSw2v4cZjeJCOg0cY2Fh4dPd4/XgMDx6XF0MYCOed8EuPcqCpwFNe08jTz6O5DMJHzMQI+F93/2oyTWrFi9iFYu038TGyaiKB0SefIzB1MuE5M96yMqdXvkx+Sy9O9ZVVraXtYJfKjCxZVptB5MRj9+v9HDpqH9DjrVPxyte7t1gF1ZEkI0uexcrmkKZFeM4MvB2tjCx5lnRxlPbjjyHUNQnd7caOF4kvf5LA5AkAjCx5FmnZOJUqwRnd+Cd1jm+PzjsC3SswAgGkaRJ/4hnK+QRd716IJjRGn3wOp1xB6BqNxx9NIBRlZMmz5AtxJp9yAv6JnSAl8YefBCB29JHj57byBaRpEZl/BKIhTHrVy9iGj6Z3LyQT1NG9Xsx0lvSqNYTnTAemY+XypJ5btcv7F7Wvd6kXsUv9iJ1/VNbVLTW1mLJX0nFwTJPRJ5eRfOYFzEy21lrb1/5S4pgWlfgI8cVLGF26fK/7jTy5jMxL63FM8zXP94bLK+XYOU1Sy1eTem7lnvvYNiNPLiO+eAlmOotjWUj5Trm1VSunlLJWZsl42ce3je1XHk4w/NATOKaJMAxCR8wgMm8OiceWkl71Mo0nH09gWheOaVJNpkg8thR/9yR8kzoZfugJ4o89XTtu5lSaTj0Rd1MDiceWEls4n+jRRyJ0DTOTI/HYUsrxURpPOg40jeFHliA0DcPnJbZwPr6JHSQeW0pu41ZazjgZb3sLVr5I4vGlJB5fSnTBUeh+H8MPPcHoMysQhkF0/hGE58wgtXw1I089R/N7301oznSQtfeVeGwp4SNn0nTqiVSTaRJLnt3tvUu586Yv41/vrJ9Xtr9NPzLlkKNaeMpe5dZvpu8vd9WevaUzbP3fG3A3xpj+zc/vdX8znWH7TYsoD8Zf99zpVWvIrtmIp6mBaV//3AEr88Yf/wa7WMapVl93363X/B+B7sl0XnQurnDogJXhwJHjz5nKVgVM0LVdnuHJnevhmbXXAcs2qUobh9pz0+lf+QzuhhgDf32QHY89ybTvXo47PYSpSRJPPsPQg49TndjI/O9eTs/1t1Ls3YHERnd7mPmdrzD80BJ6v///CB8xg+7vXo5t6GS3bGbwzgcIe/xM/e7l5J9/mhf/8+cI00Jogpn/+GVc8R1sufl2Io3NdH/3csRoPxWrwuA9D5Nbv5np/3w5AOt/9jvMTA6Jg7elie6vfIbtN91JKZOh43MfJdbQREVaJNdvoveBJ4lNmkT3dy9n8//+H9WRJNK2x+vDFoKyVUVatTk4K9LCxqZiVzDG6qcy9gzPGnvOqdQfFXjKXknbximVx76pTQDsVCr7PsCROJUKcj/CBtvGse3XPt/fwClXcMrl/du3Uq3dcj1kf91/ZViC1/Dgde0+LMGj1YYgeHUXPpcXAEN34RYGblH7MPd5A3j8QTyagS01/P4gnpIXr8uDRxi4bYmUorYdDdsRIAw0zcDvC+LVXLht8AgDvz8IgFszMGyJx6mdz+1yodsSTWoITcfvC+D2eHE5Ag967dz5Wvk96FRsgc9XGx7idgSa1EBoeDTXWDl0bEfg8/rx+1655q7l8EhBLdP1nXd6x57h1a6z8xmeI2y8ume8fkpGGa/Li0szEI66uVWPVOApuxlZsozUCy9h7yU4zHSWTT/7Pe3nn05wWm0GfWk7bP3NDdilEtXR9Bu6VjWVZtPPfk/H+88k0D35by5z5sV1xB99eq9lfi2lHYNsu/Zm3NEIkz/7sXdI783XF5jaRfv5pzOw6AGsfJGGhfNofs+JADhVk62/uZFw1ySmff3zDPRtZdPPfk91ZBTfhHY6LzoXx7TY8qs/EDlqDtO+cSm6zwdAz3V/JiOqdH7wfYR1L5t+9nvihSHabEns2Hk0nLCArb+5gTglplxwJg1tHQBYmRybb76W6NFHMvETF7L5F9eCBCtfJDx3Fi2nvRszk2PTz35P08nH0djehOP14JgWW39zA/mYh8mf/ghaqlArazINQMvpJ+Gb1EnvH/5yUOpZeedRgafsxsxmKQ8M7fU1aduUB4awS7sGi6Q8MLxftxH3OJ81dr7ym2vpWcXiPsv8WpxqlcpQAqf8xsv+1hu7pSmgWC0jzdpMIrZTu11ZckxK0qJkVylUigBU7SoVaVG0yuQyaSougePVsaIBzKCH7Np1pDPDuFyCklNFZNPk02mq/TvwdbYhW2LkMmmkZVP16NgRP05ThGq+QHrtOsqag6kJioUcwimS6t9B1WehT5mCGfCSy6apunVs3Y1sCFHRIbN2Hdn4AAGPQdEsY6aTpHbsAMDfNQE77CeXSWMVipheHTsWRPo9pLduoyJcVN0aVSHJ57LIZIpU/w78kzrQXS4qLoGdy1CSFhYGxWoJ2xDgSMpjwxKKVhkxVj9lq0KxWqLqmEj9UG3ZK28lFXiKcqgSIIXEdmxs20JqOtrYrThbOthILGljjYWgM7Yt19tH7vq/1FY8b4iCEJRTGbZc/xdSXslxX/sK2VVr2Hr9X8jKKmFcNJ11CtJx2HL9X9A9Hmb8/WUIXcNybLJbethx+71M/8bnESPDDNz1EJ5cBZC4Wpvp+MQHSD+3ip6b7mDGt7+CXkwh/F5yff1sv/F2kg1uZnz1qwzd9ygDDz3Ozg44bRe9j/zGrWy5/i+1Z3hf/hQIQX44Tt8d9+KtSmZ8+yuYL62m74770CsWIGm54Ew8zQ0M3L2Y9IOPARKHWj1Zjg1OrR5sJJbzSv3Yjo1lWzg4IFTg1SMVeIpySBIIu/YML+QN4PXu/gwvqHsQwkXI8BHx1TrdeA0vjnDhMTR0r5dIIIw7EMEul6k4OtFAhLJRwS91bEcn4OiYukHUHyWouZE4RAMRNK+bSCCMU631enU0N9FAhIA0KDgaPkfDjwsj5Kfs8eJ3dGzDhxmMEAmEydsVfI6GEC6igQhFlyTgaASkjjl2HEBQukD3Eg1E8AQiRAIR7GIJR+oEvQECLp2A1PE4GpbU8OgedJ+XgHDhdnTCbj/SH8YuFMmiEfIG8foCSMchIFw4wiHsDhAcqx/btgn7Qng1D8JSz/DqkQo8RTnMhGZNZ9KnLhr/fuCvD1Hs6WPWv3wTz8A2Nv/iWkTFBAG+iR3M+sqX2HrN/yE0nVn/8s3x44YffJzRp58nMm8Os/7lm2z4z18xkk4gAE9zO9Ov+jKlZx5j43//L60nn8DM73wNqN1i3vqnu4k1tdbOt3416//95yBrC8bWliCC9f/+C8Jzpu92ze03L6KYSjLl0k8QDTew9uofkTDzBHERnDOLyZ/5GJt+/Bsq8RE6P3wezaeeyIb/+tXbUKvK4UAFnrKbyLw5eFqbKe0YJPnMClrf916cqknquVVIx6Ht3NPwdbYf7GIqr0MIwdB9j2LlC3haGvFP7KD/1nuIF0fwmRahGd1E5s0hnk7Qf+s9RI6chae5ESEEjmkysOhB3A1ROj9yPnahSP+t92AXi7iiYRqPO5qYP0L/rfdQlFlmXng2gc52cBwGFj3IaCVFy4kL8WHQf+s9JDIDRCXEjpmLv2sC/bfeA9Q6nXjbWxBCYGayDD/4BMHpU/CKLuKPPEVBGkjbwdvWQvtxC/E4gv5b76HxxAUIl4tA18TxXpqKsj9Uu17ZjX9iJw0L5xOc1gVAZO5swnNmoHs96F4PDQvn154LKYe8zEvrSC1fjSsUIjijm9Ty1WTXbETaNt62ZmLHzsPb1kxq+Wp8ne2EZk0Dap2JUstXo3ncNCycjysaIbV8NU7VRA/4CR85i8C0LlLLV1NNpokefST+CR1IKUmteJH8pm0EpnXhba2dO7dxG0iJf/KE8UHlqeWrCc+ZUQstwC6WSS1fjbe1mcDUyWTXbiT9wksgJa5IiOj8I/A0N5JavprA1Ck0LJyPp7nxYFav8g6kWnjKXrljUWIL56N7PQhNI3zUbHD2MlhXCKILjsIuFMhv2vaqHpzgn9SJuzFGeuXLexyqeT1E5s7G9Sbn1vQ0NxFbOI/0ypfHJ6vetXyxBXMp9Q/tMSjeCAUJTJ38Dplbc/9VU2mSz60kOHMq0rQwszkyL64DQOga0flz0Vwuks+topxNEFs4HyMUwMzmyK3fjHQksYXzcCoVks+tpNg3ANRa/3mvILdhM5rhJ7ZwPoWgA0KjNDBMqa+f6DFzKZsZ8pu24cqUAGrBOW8+drFEds0GYgvn17a7XZTjIxR7+nCqJrGF8ykPxan0lpGWBUIjtmAuScMi8+I63Jas/Z301Z5jFrZtp9i74+2vYOUdSwWesle+Ce1M+Mj5QG1qptaz3rPX/YSm0fnBc6iOpuiN37JH4EXmH0F0/hF7DTxXODh+jTcjOHUyge5JZNdsxLZKu5fP0On40LnEFy/ZI/A8LY20n3f6YRd45f4h+m+7lxnf/gruxhg7/nJXrbUECJeL9gvOIrv8Jfpvv5fK5GZmffVLSCkpbOmh/7Z7a8sDff8KBu95mOEHnxg/b9v7TiOeGGTglruxfSFmXHUZhaGtCK0WgvHFS5hz9VVUh3oY/dM948sDGQEfnRedx8CiBxh9Zjmz/rm2PJCUksKqNQzceT/ethamfesL9Fx78yvLA7n9dHzwXEorlzN892M0dHfT9dmPjU8Vllr+oloeSHlDVOApr6s8MMzAogeQjsO0r+19KjBXLMKUL19CYUsv22+4DXdTA91fvqTWQjSM8fXwNv389wSmTKLj/WeBdmDvqM/4+8tASrb/6Q7scoUpl34CAKHrNJ92Eg3HHc2W/7keK5en+7JP421rRvO4D2gZDgWhWdOY8NEL6Lnuz1RTGZxKBVcswrSvfQ7vUC9bfnEtsaPmMPtfvsloOQtA7/W3gITZ3/sWdqXK+v/45fhA/vARM+m86Fy2/e4mcl6YePFFNLfWBpUXe/vY8Pu7aFxwFDP/4ats/O//ZaCSJlBxiE6dyqS/+xBiyxrW/dvPaTn9JKaffSpQ+yVq889+h3/KJGZ/71tURpKs+8FPccoV3E0NTPzEBUTCMTb88NcUO2NMu/yzBMNRALb+7/VUEklaz3oPje9aUBvIrij7QQWe8vqkBCEQrxFQQtMw/D7csQj+yRNwxSIYQf/4MTtXG/dP7MDb1nLAVx8XQoyvfuDraMOuVHa7hu5xI4TAP6kDK1/EFQmh+7wHtAwHlsR2OwgBw7kRfNKLJjTcem1pnYSZoyCLUMngpGst12w5R06WMAujWJvXM5gexszn8TQ14m4K0rN5PSO5JMQ84ANpF0gV0uT7B8noJgiBtXk90jQZyo3UfobhCJZXYm5eTzqkU/YZZKjgFJKUNg6RHInjb/BiV3PktmxgMJcgQwXR1k46qONsXk9qNI6r0YvhtqmWMxQ31W6vpoI6BbtAafN6zHSWofwIntZm9IYAdn8vmZEE2QYv1ZCLpFMim65Q2TBCygt2kw8nm8DVWyQhixTxMJwbwW0WkNJhRBaxZQVfIUkgXft7kKsUKFsVcnYBx2PvvdqVw5oKPOV1SWqzrOC8/mBdT2sTEz72foSu73Udlvbzz0Rzu96CUr6i6T0n7LWswmXQfsFZSMfBOCQnjN6dcARCgMdw4zHctcAzanXn1oyx9fAMPK5aK9XQDVzo2P0JEnc8iFYu40an7fhj8U3soPf6W0m5beZ/5hLcwQCay0CmciTufITOD58LjqT/jvtrHUWkoOWYecQWHEV+41aG7niQrs9+lKJboLvd6CWTxB0PUu0MM+dDHyC3ah0jix7C7Qg8Ph8Tzz4NV8li4I4HyUZ0Fl5ySW05onKZxB0PAtB16ccpbN1e+95xcKPT+Z53Q8hP7x33UrZgypcuJlUt4PX5sXoHSdzxIJMu/hDuhijxR54i+dQK3OjoCDyGG7fLDY7EjY6NjsdwjddPxaridXkw0NV6eHVKBZ7y+qTEKVeQe+u08iqay4WnqWGfr78dPTz3tfqBEOId1MNUoFm1gedRXxivb/eB52ndhyE8RFwBGgJRABL+EN72Tox0YWy+STcICJoCX9khWHYo2CaBsoMbE0kVd6ZIsOwQLEmkrO1Tu7yHkKURLEtkWRIsOwTKEseyESUbX8muna9kEihLZNmhUpaAm6p08FclnrFr5lwOwbJEVKpYhfL4NYIlB1F2KO1yzWBVIEsO3pJFwNIIlCXFsonHLuHKVmplLUvcZUmx7CAtgWfCZCyvRSwQwePzIx2HsPDgCIh5QgTH6gcJMX8En+5FmKqDej1SgacohwlPUwMTPvtxrI09bL/h9vHt8cVLxr92qibbfnMDutCQUpKmTEz46L/93j3ON/LEs4w88ez4971/+AtpKmgIwtRaTaX+Qbb++o/o4pUAsUoltv/pTgKi1ho1s/naYPdXtfh7rvvzHtcc/OuDmGNLHEnhZuuv/8iILBHEhVfUPq623/jKe3NFI0z7+qUQ70G43to7B8o7nwo8ZTelwTjVkeRu28xsHiMSQjP0g1Sq+iOFxAyYgGBTogd/yYsQGoZW+xn0V0YoygzZ/CDJwc0AZEo5suU8jpMhfcTeJweomEVSLh9ibMR20a7g6J79LlfZriIQmGPPEq1KkJQnOH4+gLJZAt1FWat9vJjVAkn3/j+ztaWD5dhUxq5RtMrYuouC2PPvn+7zIoe2kCykqdommtBASrY5GRxZwUz24Rscm3C7WmY4N0LcHB2rW6XeqMBTdpN+4SVGnly22zYj6AfJYdmj8VAlJLgKLoQQTG2ajDfoQdd0PEbtZ+ByN1IgR0uwjZbWbgC2J///9u48vq6yzuP45+zn3ixNmqQtLQOUtg5FUIF2ulEQAcfBFyDouDIjjktxHUVRoTCC4GvQ0RnRkVmclzNVBwZxVHiBpSp0ASy1tGBTbQtpmzbpln25uctZ549zc5v0Jjc3kE5Dzu/9Fz15znOec/PifnOe8yyHqK+cTnL2Ajhv0Yj1Huw8xJzaWdFmsiF0pLpoqBq9C/pEXQM9qIpKTTKayvHysf2c3XAGmno8jFq7j1KTqKbSjgYR7W07wLyGM8teFcXxXDJujmn5/fCO9XVQbVeRMEcJZkWlqa2Zs+rmoGsGhAG+Wo1PjrnTT6di5lkAdKd7qU1W06BPx0jLV18cyW9dDBcGEARYM+qpedPrAfCzWXobd0/izVKnonw6hKAqCqqioioKWn7Ua93iN1E97yzCIKDjqWeZ8ZYVKETlhobPiVRNQ1U1NE0nDENULfrvcqmqFl0jf46iaqiqjqZpw8sMqVfVon+Xu9+gGgZowZDzVQ1tjHaqioKiqORaDtO3u4kZlywDwK6bXvjM1PznoygKhDJoJY4k8GIo1EKCMCCXy5HJZLHsaATgUNaMemZcsRKAdMth+nc1lTVoJY6CICCbdci5zgTusxYW6spksqimSsI+Po2i9sLzAWh78hk6n91K3fJF0e4GjoNfYsXAQhnVj3ayd1z8cew8HzjR9AVfj84J3MH61CFlHALdwc+/cxu8ZtnX8FwC18FXc4Xzfc3BLxFSg9dItxym89mtnHPrp4umnYRAOp3B8WU/vLiSwIuhI8uieVufuPMOzp91Dv/0nduZVT/jFLfqtetQ21FW/c1q9octdKzsmpAFjX0r4PDKYwBc/a6PcOVlF7N69ScLozQHNVy2nLrli9h1z7c57PXTqVjYJf63PhIO0KckUFGBkO4wS7uSKLtdvflBK1X5QSsHwj5ySiXakJA9Fg5QpZgkid7BtYT95JTyp4G4+ORCn0olukZnmKFSMbBK3NfBsJ+0UsHMZYs59ys3j7iowcBAmrdf/SHaz+ikb1mq7PaIqUMCL24UCl/IfhDg+z7yt+6r5/s+Pv7ELcc++HsKwfd8Aj9gpF+UoqqgqoR+QBgEhASglHgSDwfLRKudMFb5E08nIESJzgPCMMjXMaRMOLwdYRgS4pfdpUmhzuPXKPu+wjCaAzpi20M8zycIQ1k2P6Yk8MQw9Zcso3bRG1Gt408S9qwGzvzQe+Qd3iSlGgYLPvtRzJ4jNFTUkDRGX0HG6jnK7OoGdFUjBDoGemgYnKdWhq5MHyoKNflNVek4yNzpcwqjRwESvW3UJKqoNKMnR7WrlbOnn172g6/ju2Q8h2lWNLKzOtVFtVVBwhh9NKnS2cKZNadhV03+BQXEqSOBF2O5GodOvZtnnt5K/bTpXPrmJRjVlRjVlcPKjTWZPK5c12Xjhi2093XRW9GHo56coe7Z+hyt6aNs2LiFWfUNLFn6pmE/V1QFe1YDlp7DqqrDtkbvorQMB7umAT0/aMVKadhV5W+zY6UMVEXFroh2uDBJYTfMQB/yVGWZPlayGtuOAstS09gNDWU/4SmeQ+jmsPOhavUq2IkqbLNEkCsDWHUNGHrxXLzu7l62Pd9IW38n6foMXtIr+37F1CKBF2P9c1Ps6tvLPXf/MxVmko1PP4Q6wQs6T2XpdJYv3nIvnubTtrgDP3ES1mdUoHthL5uPbGfXHXs593XzefCh70z8daaw/ftauOULf49re7Qv7ZTuzBiTwIu5UA1xKl00Pcdvt287Puy9tpaF8+ef4tZNPr/ftYv+VDTgYSCdwal08LWAUD253b2BEeBWunTTxzNbtxaONzTUUZ9/+h7IpTE0g6w7+qjLgVyG7nQvqqIBIX2ZFNoIE7pH05dNoaJE78GAtJuhO92Try+Syg2gEM2nA0g7WboGeih3NI/ruzieg+dHf0D0Z1MEYUDayY56zuB96aoOhLz00n78/Pkv72smV+ni277skB5zivXh+fJiJs7yv33FV5j99EyU/NDvyy9fzjf/cfUpbNjk9Nc33Exj4x4AAv34SErg5H6Z5n9PRkpn5taGwuGbPv4BPrYq2gbpQOchGqrqSJbo+jvYdYg5NbMKc/XaU100VL7yiecvHdvPvJEmnierqbTyE8/b8xPPy+T4Llk3R7V9fOL5tERV0QjVoZramjmz7nQMTScIAt58yftIpaL9+HLTHNov7IwKSuDFmgSeiASQbLMLE3LPqJzNpTOXnuJGTT5rD22gIxstvRaqIZkZ2f/XL1HVVbA7jgfaG6cv5A21CwHoC1IkFRtdGb3jpi9IUaUmUfLTEjJhlsQ4piVkwywKCpYShU930EONWp2vL5IKU1hYGPm1NHvCPmqU8jfZ9fHxQh8rPy0hHWYwFQO9RIdUd9DLNLUKlWiN0IeaH8MNoidM3/TJ1ZU/D1BMXRJ4YkSJNpu6nbWnuhmTTttFHTjTJs86jNX7KqlujgZ3uLaH5qiowegvqVzbQ89qKCiEhARGgOaW36XpG9GEdc2LznGSLkZaH7aWpmd7qK6K6kftcJMeelobVqaUUAkJtKBwDc/0UT0VtcSWPk7Sw8hoKGF0X4dXHiM05KtNDCfv8MSIvIRHas7AqW7GpOObk2u1GafaLfyefCNA9VSUEt/zgR6VKfxbC1H98h9RQy2EkMJ+cr7p4zjDAzPQg2gvv0KZAM0Zx0gRJQq9wfMDPUTxlZL35ZsBztBrnOR3quK1SZ7whBBCxIIM0BVCCBELEnhCCCFiQQJPCCFELEjgCSGEiAUJPCGEELEggSeEECIWJPCEEELEggSeEEKIWJDAE0IIEQsSeEIIIWJBAk8IIUQsSOAJIYSIBQk8IYQQsSCBJ4QQIhYk8IQQQsSCBJ4QQohYkMATQggRCxJ4QgghYkECTwghRCxI4AkhhIgFCTwhhBCxIIEnhBAiFvSJqkhTNXRVK7u8HwR4gVeiPhVN1VHGWYeqqOha6fMAQkI83ycIA0zNQFGiMxzPJSQ83g5FRRujPj8M8HwPVVExND3fNh8v8MdoxWCbFXTNQBlS16sVfQ4aCsqYn7UQQsTBhAXeLVet4ktXfbzs8j/b9gSf/OHtZN3ciD//5OUfZPXVn8LUzVHreOSFX/OJNatJO5nCsRtXvouvvfOL2IZV8voDuTSfe+CrPLz1cV68+wlOq5lBEAasuOd6dh/ZWyj34Uvfy13X3VyyvrU71nPTmtu45oIr+e4NXwXgwece4fMP3kPGzZZsB8Bbz7uUNR/9FqZusq5xIzetuY2edN+Y55XyvqXX8PV330qFlRzxcxJCiLiZsMAzVJ2EaZdd3tQMlBLPTbqmY5s2VonAM3WjqAZN1bFNi4RRui1BGKDln0htwyJh2gRhgKIM7+XVtei+SgWeqZsoKGiqVvgMDN1gzMfMQpvVwr2aull42nw1NFXDNmwSpj3i5ySEEHEzYYE3lOO5uL5bsowX+CQtG0VR8AOfnOdgaDqGZgBRVqRzmaLuPV3Voy/wIaGgq1rhSVBVFNK5LEEQjHBVBcswx+x61TUdc7AdikLayeCf0D05UjteKT/wC/ea83KEYViy/NDPaTSWbjLYNF3VSJgJsp5TdB9CCBEXJyXwvr/hAe779X/ilngXdfm5K3h69f9iGxYbd29h1X/dysfe/H5ufttHAHhg8y9Y8tVri+r4y8Vv5yvv+CwVVqJw7P3L3sGd130OgJ9vW8eKe64j5xUHbsK0+e4Nd3H5uStKtv+DK97F6ms+BcBPt/6SZXdfh3NCfe+48K3cdd3NVCcqS9ZVjg27n+PCr1wFKOTcHL2Z/pLlV112Q+FzGk3STBSeNq88byW/+dJ/8+Wf3MsTjRtfdXuFEOK16KQEXn9ugCM9bSWf8rJujlnTGkiaCeoqa1CAKruC2TUzAVAUlaM9bTgn1PHbpuf53pNrMHWDna17cH2PCitZOM/QdI72to/4bjBp2uRGeWc4VKV9vD5N1Tja00bOc4aV2dy0nfuf+iG2YbHrUFPRz8cj6+Y40tNWdvmhn1M5EobNzOoG7DG6eYUQYio7KYE3ESzdpDpRheMPD5Ldh5vY2boHiLoCXd/D8Rx60/35YwFVdiWmXtzllzAS6Nr4btnUDaoTVeS84UHZdGw/9z52f+GaY3Xhniw518l3gw4/buoGtmGhKAqu79KfTZV84hZCiKlu0gbee5dczVsWLh82RQDg0Rd+zTce/9dhox9/vm0dm5u2A3D56y/m0c/+AFMvvjVFUfmT2tPG1Y7rL/oLVsxfVNSOtTs2cO9j95PKDYyrvon2P1se5V+e+lFRl+u1F76VW65aRdJMsHHPFm7/6TfZ23bgFLVSCCFOvUkbePVV06mvml50vD87QGeqB8dz2N/ewvpdm+ka6KFroAeAFa9bzDmzzx5zlGa56iprqKusKTqecbK093eScbIc6Ghl/a7NE3K98epIdfPHQy8Xdf0umvsGgiAK6b5MipeP7mNApiUIIWJs0gZe1s2RcYrnsC2cPZ+78gNUHn3hNzz78vN4zvGuOsdz6BnoI2uM/K6u0kpGUwZeZTsWzJzL3137twA8sWMjm/duH/ZzUzOpSVbjeA7+kBGjhqaTMBNknMwr7mLMujm6B3oBCMOQmopqHM8l5zkjtlUIIcQkDrxHtv+KHzz9k6JQeNv5l/LpK24kYdpYhlk0v2xd40b2HN2HqhSvmmYbFndc8xmWzb+w7Has3bGef1v/QNET1BXnruAzV36ISrsi347hLbls4VJ+vOo+Pr7mNl46ur9w/M/OvoBPX3kj9z+5hk17fld2O4Z6eOvjbNn3IgArX7eYH6+6D03V+Pm2dXx/wwOvagCNEEJMVZMq8P5w6CUefO4RAJo7WplTO6toPl1dZW3R3LcFM+dy0Vnnj1m/pmps3f97mjtayHkOBzpbRyy3+8jeQjsOdh5mTu3MYU9pg+1Q1dGXIp1RXU991XSuX3QVze0theMXnXU+Fy9YxENbHi0cm107k4sXLEZVVA73HOO5pu1FATtUa9cRWruOALB03gUsmXcBlm7y4sE/lmyTEELE2aQKvLU71vPkH54B4Ka33MC33nt70WjLwQnfQ116zlK+/u4vj1l/2snyqR/dzm92PkMIoz4JPfnHZ3km//R14yXv5h/esxrLGL7ii6ZqJVeBybkOaSfD59/2UdQhAR0EAQNOZtggkzecfg73feBOTN1g3c5N7GjZhZMub9Rn1snSPdCLqRkEQUBtchq2YVFhJZmAOfFCCDFlTKrAc32v0IUZAkkrUTJUBhmaToWVHLOcgoLre2MO3vB8r7DCSxiGJK3EmGtznui3Tc/z7xse5J53foF5M84sHH9u7wt878kfsq25sXBMU7XCvdq6Na7VW9Y2bmBv20FUVeH1c/6Ub3/gTnRV4/Tpp2Hp42uzEEJMZZMq8Iba13aAJ3ZsxNBGXwbsheadeIHPgY5Wfvn7p8asM+c5tPV1jqsdze0tPNG4EbPE/L3nmxtxfY/WriOFdqzftZm1O9azfP5FzJtxRqHslr0vsq5xA32ZVOHYsb6Owr0+39yIO8IqMaPZ397C/nyX6dz6M7ji3BUkh6xCI4QQInJSAs82bGqS1SUnYxuaTm+6n5zrkMoWz2V77MWnWLdzU8lFj73Ax/FcfrVzExt2jz0tIAQcd3wDOn61cxPrd28eox0BjuewYfdzbG7aFh3zfRzf5Y6ffXNYl+Zg2aFeaN7Jjf9xc7Q9UBC84kEnWS9HT7qPnOdg6gZJMzEha30KIcRUcFIC78/PW8nptbMISiyCfLj7KLc+fC9+EHC0t71oNKYXeMOmG5TiBT6ec3IWRR5P3X7gkz6h7GjbHw07LwwmZDrBpj1b+PxAL7qqcck5S/ir5dePuytWCCGmqpMSeAtnL2Dh7AUlyzz8u8e56xffLmu/OFGepmPNNB1rBqKFst+35Foof8qhEEJMaRMWeDnPGfZeaiwZN1u0XNep0p8boC+TIggDgnCkbYVee1zfoz8b3VPGyU6ST1oIIU6dCQu8dY2bONrbXnb5/R2tk2Yx47sfuY+kmSAMQ46N4x4ms637XuS2n34DQ9M50Hmo6L2hEELEjWJ9eL788S+EEGLKk2U5hBBCxIIEnhBCiFiQwBNCCBELEnhCCCFiQQJPCCFELEjgCSGEiAUJPCGEELEggSeEECIWJPCEEELEggSeEEKIWJDAE0IIEQsSeEIIIWJBAk8IIUQsSOAJIYSIBQk8IYQQsSCBJ4QQIhYk8IQQQsTC/wHTQU5vs/rNZQAAAABJRU5ErkJggg==" id="imageef958413c0" transform="scale(1 -1)translate(0 -266.4)" x="33.2875" y="-10.445107" width="319.68" height="266.4"/>
In [5]:
action = env.action_space.sample()
In [6]:
from tqdm import tqdm_notebook
In [7]:
episode = 100000
state, info = env.reset() 

trajectories = []
win_or_loss = []

for _ in tqdm_notebook(range(episode)):
    
    trajectory = []
    
    while True:
        action = env.action_space.sample()  # agent policy that uses the observation and info
        next_state, reward, terminated, truncated, info = env.step(action)
        
        trajectory.append( (state, action, reward) )
        
        state = next_state
        
        if terminated or truncated:
            if reward > 0: 
                win_or_loss.append(1)
            else:
                win_or_loss.append(0)
                
            state, info = env.reset()
            trajectories.append(trajectory)
            
            break
env.close()
/Users/gaominquan/anaconda3/lib/python3.7/site-packages/ipykernel_launcher.py:7: TqdmDeprecationWarning: This function will be removed in tqdm==5.0.0
Please use `tqdm.notebook.tqdm` instead of `tqdm.tqdm_notebook`
  import sys
  0%|          | 0/100000 [00:00<?, ?it/s]
In [8]:
import numpy as np
In [9]:
from collections import defaultdict
In [10]:
G = 0 
gamma = 0.9
value = defaultdict(list) # value[state]

for t in trajectories:
    visited = set()
    for (s, a, r) in t[::-1]:
        if s not in visited: # first-visit value
            G = r + gamma * G
            value[s].append(G)
            visited.add(s)

for s, g in value.items():
    value[s] = np.mean(g)

self-practive: adding a q-table¶

def policy: give action by q-table¶

In [11]:
value
Out[11]:
defaultdict(list,
            {(16, 2, False): -3.0571465895880943,
             (19, 6, False): -2.712807421761196,
             (19, 10, False): -3.0106397275241825,
             (15, 6, False): -2.8497013172028174,
             (12, 6, False): -2.8826780088044677,
             (20, 10, False): -2.7939530108285364,
             (20, 5, False): -2.747384048229748,
             (16, 3, True): -2.520428137717765,
             (8, 5, False): -2.7344629303641743,
             (18, 8, False): -2.7988676298652226,
             (18, 8, True): -2.4864167470962233,
             (21, 6, True): -2.0081198596918672,
             (18, 4, True): -2.298033051898793,
             (12, 10, False): -3.0794565074712175,
             (12, 4, False): -2.8335971595547673,
             (12, 10, True): -2.9659473446149476,
             (21, 10, True): -2.1598062720822275,
             (20, 3, False): -2.7250791256682714,
             (14, 10, False): -3.218776348660159,
             (15, 10, False): -3.1555030785917633,
             (19, 3, False): -2.8228947682253134,
             (18, 3, False): -3.0041828340144634,
             (12, 3, False): -2.873275096018703,
             (18, 1, False): -3.122025271976764,
             (17, 2, False): -2.9308603352816287,
             (15, 9, False): -3.0625646038673846,
             (21, 1, False): -2.729662793975098,
             (13, 1, False): -3.241650830423245,
             (12, 1, False): -3.336125765283581,
             (9, 1, False): -2.9847977286688874,
             (9, 3, False): -2.693210991172605,
             (21, 10, False): -2.595981564100574,
             (17, 10, False): -3.2069126000319077,
             (16, 6, False): -2.8982058665634547,
             (20, 6, False): -2.717402218981524,
             (13, 10, False): -3.1409366224363726,
             (18, 9, False): -2.9374168130256195,
             (21, 8, False): -2.3326729794397556,
             (17, 8, False): -3.105405062748975,
             (16, 8, False): -3.2325011224062403,
             (19, 7, False): -2.68547985785999,
             (18, 7, False): -2.6539157381065674,
             (11, 7, False): -2.6792344984380057,
             (13, 1, True): -2.915273421194813,
             (20, 4, False): -2.759136666068843,
             (17, 4, False): -3.09216650528318,
             (10, 4, False): -2.408528225881738,
             (21, 8, True): -2.061548940263884,
             (16, 8, True): -2.306722070569491,
             (14, 8, True): -2.8031576322330602,
             (18, 3, True): -2.8794847715768035,
             (17, 3, True): -2.68050710647052,
             (19, 8, True): -2.126428229307868,
             (16, 10, False): -3.1832818580143085,
             (11, 10, False): -2.91939801888793,
             (15, 5, False): -3.0172303306490496,
             (14, 5, False): -2.842737686027602,
             (17, 5, True): -2.563733911519275,
             (9, 10, False): -3.006723798575006,
             (15, 1, False): -3.320963453856445,
             (20, 1, True): -2.8331053950928027,
             (20, 9, False): -2.720083416225954,
             (11, 6, False): -2.537781233470525,
             (13, 5, False): -2.8417823993249636,
             (15, 7, False): -3.09717618247024,
             (18, 10, False): -3.0763250327119893,
             (9, 6, False): -2.678639114504774,
             (14, 8, False): -2.942609187715718,
             (13, 8, False): -2.9624341977675144,
             (21, 7, False): -2.6950750508918286,
             (12, 7, False): -2.894412028041224,
             (18, 4, False): -2.8736763385657422,
             (15, 4, False): -2.902235841923674,
             (12, 2, False): -2.868616195623174,
             (7, 10, False): -3.037238495944908,
             (16, 3, False): -2.997145536671064,
             (19, 1, False): -3.167680662165189,
             (17, 5, False): -2.9649061453819368,
             (17, 10, True): -3.053176875415088,
             (20, 2, False): -2.753185990421281,
             (11, 9, False): -2.9323782692443223,
             (16, 1, False): -3.4581080626416107,
             (15, 1, True): -2.9306699412849366,
             (8, 3, False): -2.8885362862577613,
             (11, 5, False): -2.463129733559265,
             (5, 4, False): -2.8405122480419775,
             (17, 6, False): -2.895583666006279,
             (8, 10, False): -2.9931421781725143,
             (15, 8, False): -3.0535242071244273,
             (19, 9, False): -2.769178171191991,
             (19, 5, False): -2.733282908750978,
             (15, 3, False): -2.9922878654324356,
             (13, 6, False): -3.0251739073049775,
             (21, 1, True): -2.4752878328228918,
             (16, 1, True): -2.850976600648673,
             (17, 9, False): -3.20431957527415,
             (12, 9, False): -3.1281254570258277,
             (9, 9, False): -2.962150676027894,
             (7, 6, False): -2.6577932717014106,
             (19, 2, False): -2.967720156150463,
             (12, 5, False): -2.7956353794172157,
             (20, 7, False): -2.7187776391230902,
             (16, 7, False): -3.0446288056878275,
             (12, 8, False): -2.9168010824341137,
             (16, 10, True): -2.9636013398589562,
             (13, 10, True): -2.9590645067298094,
             (14, 6, False): -2.8082766705591093,
             (6, 6, False): -2.663061160854569,
             (10, 5, False): -2.5825761973342707,
             (21, 6, False): -2.672963383671972,
             (19, 8, False): -2.6095971152614665,
             (21, 2, False): -2.824831702964427,
             (10, 6, False): -2.608137661342639,
             (21, 9, False): -2.5934898541320495,
             (21, 9, True): -2.0758707062753556,
             (10, 10, False): -2.9283469182948663,
             (4, 9, False): -2.9344715528305314,
             (17, 1, False): -3.280447250194101,
             (7, 1, False): -3.1280366993028412,
             (19, 4, False): -2.7508335396563206,
             (7, 9, False): -2.8285874891145584,
             (18, 5, False): -2.9165398527271855,
             (6, 5, False): -2.404568827731161,
             (13, 2, False): -2.7566196163810974,
             (14, 7, False): -2.9794707754056278,
             (19, 10, True): -2.557884345729407,
             (15, 3, True): -2.7440747713617704,
             (15, 2, False): -3.0625958382277765,
             (4, 10, False): -3.0694719308818748,
             (13, 3, False): -2.935091427832379,
             (17, 6, True): -2.3890030493770684,
             (12, 8, True): -2.428341522457083,
             (20, 4, True): -2.495769580288626,
             (16, 4, True): -2.4958525645383074,
             (20, 1, False): -2.946500130211952,
             (15, 10, True): -2.966173824839517,
             (15, 5, True): -2.5002396595278698,
             (16, 9, False): -3.0647205587691135,
             (14, 2, True): -2.8689321838288144,
             (9, 8, False): -2.7409605911031294,
             (19, 4, True): -2.369459477174782,
             (13, 9, False): -3.109229123406068,
             (11, 1, False): -3.1489685019383815,
             (8, 1, False): -3.1238187567280313,
             (13, 6, True): -2.9051751174587155,
             (14, 1, False): -3.2910112252830626,
             (18, 1, True): -2.7558435792269225,
             (10, 1, False): -2.9786410906586394,
             (18, 6, False): -2.775455778705844,
             (12, 7, True): -2.899104998061137,
             (13, 4, False): -2.873348753511588,
             (16, 5, False): -2.9450226116349754,
             (17, 3, False): -3.154413945940009,
             (10, 8, False): -2.8695605999757547,
             (14, 3, False): -2.967248767611303,
             (21, 5, False): -2.745365626920548,
             (16, 5, True): -2.749276278971905,
             (10, 7, False): -2.8626538457627713,
             (18, 2, False): -2.911298951477898,
             (13, 7, False): -2.989795505062506,
             (4, 7, False): -2.3559004451228462,
             (9, 4, False): -2.596672948393004,
             (15, 8, True): -2.622698899637698,
             (21, 2, True): -2.119458472911723,
             (14, 10, True): -2.9259255518318743,
             (21, 7, True): -2.012292552982701,
             (14, 4, False): -2.9394420577451035,
             (18, 10, True): -2.7213027441734012,
             (9, 2, False): -2.610675620784762,
             (20, 8, True): -2.0776011942422046,
             (5, 10, False): -2.866798366478192,
             (16, 6, True): -2.313358891894608,
             (8, 7, False): -2.69019596865566,
             (11, 2, False): -2.8002530999189506,
             (8, 6, False): -2.62194148012501,
             (8, 2, False): -2.9144521273720856,
             (15, 6, True): -2.8486565492644367,
             (5, 1, False): -3.119903552028464,
             (21, 3, False): -2.77011831191982,
             (20, 8, False): -2.655702780431253,
             (18, 2, True): -2.3564146699351776,
             (18, 9, True): -2.7379626355940747,
             (8, 4, False): -2.564437245489418,
             (6, 10, False): -2.9214692638389903,
             (9, 7, False): -2.8995714982496152,
             (7, 5, False): -2.8645058594039274,
             (19, 2, True): -2.5086523370550053,
             (14, 9, False): -3.0734201590530525,
             (17, 7, False): -3.075494444496002,
             (10, 2, False): -2.6817638425105472,
             (7, 8, False): -2.694309436674927,
             (18, 5, True): -2.2463336393320015,
             (8, 8, False): -2.889083864206789,
             (17, 2, True): -2.487858484385865,
             (20, 10, True): -2.4449961431330425,
             (7, 3, False): -2.9259053576427356,
             (19, 3, True): -2.4895849083830788,
             (20, 9, True): -2.169767992886328,
             (6, 7, False): -2.7989056632619396,
             (6, 9, False): -3.0803501260789488,
             (10, 9, False): -2.9281968943338734,
             (14, 5, True): -2.472024515163564,
             (21, 5, True): -2.1431148758420973,
             (11, 8, False): -2.784689321805663,
             (12, 9, True): -2.5638449707483075,
             (7, 4, False): -2.565392259206351,
             (13, 9, True): -2.560451481026416,
             (6, 3, False): -2.7854533916076787,
             (11, 4, False): -2.6565610584963277,
             (4, 2, False): -2.70453053091599,
             (21, 4, False): -2.654766497550774,
             (14, 2, False): -2.99772036443487,
             (5, 3, False): -2.7061805997045005,
             (7, 2, False): -2.7208250785783186,
             (16, 4, False): -3.0703833518703507,
             (21, 4, True): -2.1609041504569744,
             (9, 5, False): -2.6455736422520535,
             (12, 6, True): -2.7318158380927424,
             (5, 8, False): -2.7165712117720853,
             (11, 3, False): -2.67157687525728,
             (17, 4, True): -2.511774484916293,
             (14, 6, True): -2.015643706946555,
             (15, 7, True): -2.8348782226650457,
             (18, 7, True): -2.5238548530565033,
             (20, 5, True): -2.14050601158983,
             (17, 1, True): -3.057736627080487,
             (19, 1, True): -2.8664088057755643,
             (14, 3, True): -2.596063910856235,
             (17, 9, True): -3.1717588778094687,
             (19, 7, True): -2.4471234082497126,
             (5, 6, False): -2.5600692153863913,
             (20, 6, True): -2.0628912144489715,
             (18, 6, True): -2.20824832012878,
             (6, 8, False): -2.954102524147391,
             (8, 9, False): -2.935060112804951,
             (20, 3, True): -2.020273006016064,
             (13, 8, True): -3.0629995328850494,
             (19, 9, True): -2.242232074500792,
             (13, 5, True): -2.3956958721468546,
             (20, 7, True): -2.2448823512418064,
             (4, 3, False): -2.5716666912991593,
             (19, 5, True): -1.9897226049480146,
             (10, 3, False): -2.654907132125508,
             (13, 2, True): -2.6037219692614735,
             (12, 2, True): -2.3637076279425346,
             (21, 3, True): -2.0724610809041484,
             (12, 4, True): -2.390399637782483,
             (20, 2, True): -2.4609606912714916,
             (15, 9, True): -3.1507631463759216,
             (5, 2, False): -2.653851253045832,
             (4, 6, False): -2.620972891440496,
             (14, 9, True): -2.8790337916342006,
             (6, 1, False): -3.2689065955729313,
             (13, 7, True): -3.0155358463979596,
             (14, 1, True): -3.1818083991222834,
             (12, 5, True): -2.5643089748600882,
             (5, 5, False): -2.787883718786022,
             (14, 7, True): -2.8128544618228215,
             (6, 4, False): -2.758318577672533,
             (16, 2, True): -2.6257684021294345,
             (4, 4, False): -2.5114219847654633,
             (19, 6, True): -2.4406129961125003,
             (16, 7, True): -2.6848625846347356,
             (13, 4, True): -2.463861453388893,
             (5, 9, False): -2.66535005633257,
             (7, 7, False): -2.7490818026930954,
             (16, 9, True): -2.6894564904996794,
             (17, 8, True): -2.753688251776556,
             (17, 7, True): -2.6167108844787674,
             (6, 2, False): -2.872940812532355,
             (5, 7, False): -3.0166356794024685,
             (13, 3, True): -2.7341331180185935,
             (4, 5, False): -2.4009628930816684,
             (4, 1, False): -3.196772661668871,
             (14, 4, True): -2.653367599259138,
             (15, 2, True): -2.3241076978168054,
             (15, 4, True): -2.498952922491455,
             (12, 3, True): -2.410392298194063,
             (12, 1, True): -3.390914277161906,
             (4, 8, False): -2.8652427514007823})
In [12]:
value_with_ace = np.zeros((20+10, 11))
value_with_ace[:] = np.nan
value_without_ace = np.zeros((20+10, 11))
value_without_ace[:] = np.nan
In [13]:
for (p, d, a), v in value.items():
    if a: value_with_ace[p][d] = v
    else:
        value_without_ace[p][d] = v
In [14]:
s_value = plt.imshow(value_with_ace)
plt.xlabel('Dealer Show')
plt.ylabel('Player Sum')
plt.colorbar(s_value)
Out[14]:
<matplotlib.colorbar.Colorbar at 0x7f79d0e3f3c8>
<rdf:RDF xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:cc="http://creativecommons.org/ns#" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"> <cc:Work> <dc:type rdf:resource="http://purl.org/dc/dcmitype/StillImage"/> <dc:date>2023-05-13T16:10:46.325837</dc:date> <dc:format>image/svg+xml</dc:format> <dc:creator> <cc:Agent> <dc:title>Matplotlib v3.5.3, https://matplotlib.org/</dc:title> </cc:Agent> </dc:creator> </cc:Work> </rdf:RDF> <style type="text/css">*{stroke-linejoin: round; stroke-linecap: butt}</style> <image xlink:href="data:image/png;base64, iVBORw0KGgoAAAANSUhEUgAAAIgAAAFyCAYAAAAuxJKZAAAEOElEQVR4nO3dvWuddRjG8Ts5bWyito2pGCFoqX2hWltD6yBWsPi6KA5CQVARxFVdXZy6+QKCg5ODDiougi6KghRRpIMWEwqKdFBraLFtmqRN2jT+B5fcbuLnM1+HH5x88ywPz3OqAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAPgHQ90P3PXpK2vdz7x1xwet/Y2Di6397MpNrX1V1Yahy639odGF9hmra72vam51pbW/3P/ztb+r4fYJ/K8IhEggRAIhEgiRQIgEQiQQIoEQCYRIIETruh9YObqlfcjLw4db+6/2vdfaPznzWGtfVfXano9b+z+uLLfP+Gh+urV//PofW/vZlcnWvqrqjV8fbO1dQYgEQiQQIoEQCYRIIEQCIRIIkUCIBEIkEKL2vZjh3qMbVVW1tDzS2t977LnWfufE6da+qurhsd5zMQePP9M+o/sA0T07f27tf1nuPw8099fG1t4VhEggRAIhEgiRQIgEQiQQIoEQCYRIIEQCIRIIUfstaNuPvN5+id3+Qyda+9kPd7f29z/7fWtfVXXzyPnWfnzdYvuMN396oLXfMNK7gbg4O97aV1WtNS8JriBEAiESCJFAiARCJBAigRAJhEggRAIhEghR+8Gpye9W24fM7Om9bG1h29XW/ouTu1r7qqrlS72HuW6fOtU+49K5Da391bnrWvttn/R/5Oji5Ghr7wpCJBAigRAJhEggRAIhEgiRQIgEQiQQIoEQte/FnN3V/khdaN5jGJ3q3WMYDHr3bqqqBid790mmdp5rnzGzsLW1f+TRY639Z9fub+2rqiZ+6D0K5QpCJBAigRAJhEggRAIhEgiRQIgEQiQQIoEQCYSo/RK76Rf6L7Gbfv54a//1l3tb+2t2915I9698s7n/mea/38L23kvsdrzb//mv8zvGWntXECKBEAmESCBEAiESCJFAiARCJBAigRAJhKh9L+ahA6+278Wcum9Taz+/b7m1H1rqP8w1fsvZ1v7o9PvtM+5++6XWvvtjPyMXevuqqnUXe38+VxAigRAJhEggRAIhEgiRQIgEQiQQIoEQCYSofRPj9IGN7UPm7+w9v7H+z96P/Wzee6a1r6raMrbY2h888mL7jM1nej++tH6p9zK+35660tpXVd3wee/lfa4gRAIhEgiRQIgEQiQQIoEQCYRIIEQCIRIIkUCI2jfrtrzzbfuQ256eaO3Pbx1t7dcPejfFqqpO/D7Z2g/f2j6i1oYGrf2mJ+Z6B5we7+2raqj52JsrCJFAiARCJBAigRAJhEggRAIhEgiRQIgEAgAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAwH/e33Zej/LOzi65AAAAAElFTkSuQmCC" id="image78f7a6b1f7" transform="scale(1 -1)translate(0 -266.4)" x="40.603125" y="-8.759353" width="97.92" height="266.4"/> <image xlink:href="data:image/png;base64, iVBORw0KGgoAAAANSUhEUgAAABIAAAFyCAYAAAD4TYq5AAAB8UlEQVR4nO2d2w0CMQwEnUtKowT6L4VQgvkYodHKLmCVfdhJjtOxXut9C6iHAKmqOrUYrLOexQBRK+I0CqZWHDUGiHQNSraPWtmoYTkC7acGm48a59oku6tg17KTPTttU0rXdNRsrp0r1IgC8lHLTTaVI8z+YNemafsyJpvBMSb75jbthVwTJpsKZPI8ElLjXGOAspuWARJeIYz2Q0BGauNaBxS80wbPI+roZ6Q2gWyBdN0fPNiGWl/Rlxqba8KZrdNoqPU11P4JFKwR8jZcKan5Rm3wLqKjNhr1JWza5FGbS200agtb0cyjviaQPwBF28+ca5Pt97kWHUgMKDXZPrGVyWbs94ktdM3XtD6NJtk9EPSziPF85Ask17QVq9HKtT+aGgSEdT8ltlAjHzVwO7JRg5gpqXG9huDUeZI18lGDgIJde6CDVrBrwmQPtR+AhMmmqAW7ZgMCNaLutEJqwa7lziPQtQ8FpNMouWkh17aPGucao5Gwafdo1AIZkx07ancwNWyDpJqWWpFxOxIetIZaB0R1v3EXoQabsGmD5xHnmq1pwZ3W5hrYa9i9X6dR8MzOnUfcfU3nGhjI2Kc13GATniFljzSMTat7IwqcR0ydLdQI+o4WtSRyZmPUbB8uAV3TUeNcg/6qwKgRRk32Gf4vtfqN5qlVmUMAAAAASUVORK5CYII=" id="imagea4086aa523" transform="scale(1 -1)translate(0 -266.4)" x="156.24" y="-8.64" width="12.96" height="266.4"/>
In [15]:
s_value = plt.imshow(value_without_ace)
plt.xlabel('Dealer Show')
plt.ylabel('Player Sum')
plt.colorbar(s_value)
Out[15]:
<matplotlib.colorbar.Colorbar at 0x7f79b34d6d68>
<rdf:RDF xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:cc="http://creativecommons.org/ns#" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"> <cc:Work> <dc:type rdf:resource="http://purl.org/dc/dcmitype/StillImage"/> <dc:date>2023-05-13T16:10:51.973033</dc:date> <dc:format>image/svg+xml</dc:format> <dc:creator> <cc:Agent> <dc:title>Matplotlib v3.5.3, https://matplotlib.org/</dc:title> </cc:Agent> </dc:creator> </cc:Work> </rdf:RDF> <style type="text/css">*{stroke-linejoin: round; stroke-linecap: butt}</style> <image xlink:href="data:image/png;base64, iVBORw0KGgoAAAANSUhEUgAAAIgAAAFyCAYAAAAuxJKZAAAGdElEQVR4nO3dT49fdRnG4Wdm2mmn05lSEFoUSwOFQMEudKExKAlLjUsT34JsCHtXJi5caIxb3wNRF5q4kRiiJMb4D4Mp0Ag4hVbaTqct03ZmWt/Bbe4GE02ua/38+p0/nzmLPjnnzAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAADwHyy0Hzjz8+/cbT+zeelwNb+8eruaf+LYv6r5mZl3rxyt5o+vX6vP2N7ZX82/duaVav7FjS9V8/di8b9+Av/XBEIkECKBEAmESCBEAiESCJFAiARCJBCiehdz6ns/qHcx+57equYP7N+t5rdvLVfzMzNLS3eq+ZXlnfqM1rHD3b7ng2tr9RlXt1areVcQIoEQCYRIIEQCIRIIkUCIBEIkECKBEAmEaF/7gQf/2O0wZmYu7Ryp5q882t0Xc+jIdjU/M7O02H0fCwv1CmoO7e/2N8cOdruYty48WM3PzKz8eaWadwUhEgiRQIgEQiQQIoEQCYRIIEQCIRIIkUCIBEJUL+t2DvVN7RzuFl2rZ7sboT7+zFI1PzOztN19H9snbtRnPHT8ejX/6ltPVPNL7x+s5mdmlq92vwtXECKBEAmESCBEAiESCJFAiARCJBAigRAJhKjexRz9e7dfmJm5crp7odCRc91NTQc/6ju/8Uj3/L7b5w/VZ5x/9WQ1f/+t7t9f/bB72N+9cAUhEgiRQIgEQiQQIoEQCYRIIEQCIRIIkUCI6l3M3d//tT5k9YtfruY3H+/2JAc2+wfM3dnXf6a1d6D7Po6e7ZYxl586UM3PzKxtdPsbVxAigRAJhEggRAIhEgiRQIgEQiQQIoEQCYRIIET1sm7p9JP9Kf1Lqipbp/rF28qH3d/G7U/t1Wfsv94t6y490y3fFvovafaWu6/JFYRIIEQCIRIIkUCIBEIkECKBEAmESCBEAiGqdzF3VvbXh9w+Up7RvU9olm52+4WZmeundqr5hYP94uPjh7sf795K+eKl96vxmZk5cLX7PlxBiARCJBAigRAJhEggRAIhEgiRQIgEQiQQonoXs7vWPzjt7he2qvm9s2vV/O76Pdx4U65vDv/lYH3EXvmjWtzpvqh9N/v7gbYf6H7lriBEAiESCJFAiARCJBAigRAJhEggRAIhEgiRQIjqZd3yxmZ9yM0LD3YfKJdvd+/hpqbF8jPXTyzVZyyUO8R2Wbe42y/rbq97iB2fIIEQCYRIIEQCIRIIkUCIBEIkECKBEAmEqN7FbH7+ofqQhfVb1fxjn/6omj+wtFvNz8x8+5FfV/NvbH+2PuNXF56u5t+7eH81f3lvpZqfmblbXhJcQYgEQiQQIoEQCYRIIEQCIRIIkUCIBEIkEKJ+F/Nk39SJ45er+Xa38v2Tr1TzMzM/ufSVav7aTv8Qu3Nnj1fzC3fKh9j171GahfJWGlcQIoEQCYRIIEQCIRIIkUCIBEIkECKBEAmESCBE9bJu7uHlTifXumXdW5vdQ+9+fPGFan5m5sL2ejV/bad/09bize7vb/Wxq9X8zp+OVvMzM0vb3bwrCJFAiARCJBAigRAJhEggRAIhEgiRQIgEQlTvYtbe619i89pvnq3m7yx3Z3z1+ber+ZmZ18+frOY/99AH9Rk7z3QvIdp481g1v1T+nGZmFm95oRCfIIEQCYRIIEQCIRIIkUCIBEIkECKBEAmEqN7FLNzt//9//enuvpitvz1Qn9HaOr9WzW/et1mfcfnGoWr+4acuVvMbG90LiGZmVi4uV/OuIEQCIRIIkUCIBEIkECKBEAmESCBEAiESCJFAiOpl3fo75VPQZubtd7uHrS2d6M746TtnqvmZmV987UfV/MvnvlmfsX2jW4ztvtE9WO/+f1bjMzOz9Xi3bHUFIRIIkUCIBEIkECKBEAmESCBEAiESCJFAiOpdzNVT3c1AMzOnn/1HNX9puzvjw/f7G4i+8dsXq/lvnf5DfcbGLx+t5g8+/1E1/8BzN6r5mZkrbz5SzbuCEAmESCBEAiESCJFAiARCJBAigRAJhEggRPUuZuXybn1Iu1t57ti5av7xR1+v5mdmXlg9W81/d+Pr9Rm/e+mH1fyZn71UzV/d6h/2d9973bwrCJFAiARCJBAigRAJhEggRAIhEgiRQIgEQiQQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAID/df8GZWT86v0lyFEAAAAASUVORK5CYII=" id="image2e385964e8" transform="scale(1 -1)translate(0 -266.4)" x="40.603125" y="-6.912" width="97.92" height="266.4"/> <image xlink:href="data:image/png;base64, iVBORw0KGgoAAAANSUhEUgAAABIAAAFyCAYAAAD4TYq5AAAB8UlEQVR4nO2d2w0CMQwEnUtKowT6L4VQgvkYodHKLmCVfdhJjtOxXut9C6iHAKmqOrUYrLOexQBRK+I0CqZWHDUGiHQNSraPWtmoYTkC7acGm48a59oku6tg17KTPTttU0rXdNRsrp0r1IgC8lHLTTaVI8z+YNemafsyJpvBMSb75jbthVwTJpsKZPI8ElLjXGOAspuWARJeIYz2Q0BGauNaBxS80wbPI+roZ6Q2gWyBdN0fPNiGWl/Rlxqba8KZrdNoqPU11P4JFKwR8jZcKan5Rm3wLqKjNhr1JWza5FGbS200agtb0cyjviaQPwBF28+ca5Pt97kWHUgMKDXZPrGVyWbs94ktdM3XtD6NJtk9EPSziPF85Ask17QVq9HKtT+aGgSEdT8ltlAjHzVwO7JRg5gpqXG9huDUeZI18lGDgIJde6CDVrBrwmQPtR+AhMmmqAW7ZgMCNaLutEJqwa7lziPQtQ8FpNMouWkh17aPGucao5Gwafdo1AIZkx07ancwNWyDpJqWWpFxOxIetIZaB0R1v3EXoQabsGmD5xHnmq1pwZ3W5hrYa9i9X6dR8MzOnUfcfU3nGhjI2Kc13GATniFljzSMTat7IwqcR0ydLdQI+o4WtSRyZmPUbB8uAV3TUeNcg/6qwKgRRk32Gf4vtfqN5qlVmUMAAAAASUVORK5CYII=" id="image3f45d39474" transform="scale(1 -1)translate(0 -266.4)" x="156.24" y="-6.48" width="12.96" height="266.4"/>
In [22]:
%matplotlib notebook

from mpl_toolkits.mplot3d import Axes3D

fig = plt.figure()

ax = fig.add_subplot(111, projection='3d')

X, Y = np.meshgrid(np.arange(value_with_ace.shape[0]), 
                   np.arange(value_without_ace.shape[1]))

def get_z(x, y, ace):
    data_source = value_with_ace if ace else value_without_ace
    return value_without_ace[x][y]

Z = np.array([get_z(x,y,ace=False) for x,y in zip(np.ravel(X), np.ravel(Y))]).reshape(X.shape)

ax.scatter(X,Y,Z) 
#ax.plot_surface(X, Y, Z)
#ax.plot_wireframe(X, Y, Z)

ax.set_xlabel("Player Current Sum")
ax.set_ylabel("Dealder Show")
ax.set_zlabel("State Value")

plt.show()

Talking: Shortage of Monte Carlo Methods:¶

  1. computing efficient: if our task is very long term, the collection need can become very huge;
  2. time comsuming;
  3. if its state space is very huge. If we need some traning samples to fit this q_table, it would become a very complicated fitting task. We need lots of data.
$$ argmax_{(s, a)}\begin{pmatrix} (\mathcal{A}, s_1, s_2, s_3, ... s_n \\ a_1,., ., ., . . ., \\ a_2,., ., ., . . ., \\ a_3,., ., ., . . ., \\ a_4,., ., ., . . ., \\ a_x,.. ., ., . . ., \\ a_N,., ., ., . . ., \\ \end{pmatrix} $$

SARSA Temperal-Difference¶

$$ \begin{align} Q(s, a) &= (1 - \alpha) * Q(s, a) + \alpha * Q^{new}(s, a) \\ &= (1 - \alpha) * Q(s, a) + \alpha(r + \gamma Q(s', a_{next})) \\ &= Q(s, a) + \alpha [r + \gamma Q(s', a_{next}) - Q(s, a)] \end{align}$$

temperol: relating to time:

The error betwee $Q(s', a') - Q(s, a)$, we call it temperol difference error

=> TD Learning

$$ \mathbf{dataset} = \begin{pmatrix} s_i, a_i, r_i, s_{i+1}, a_{i+1} \\ s_j, a_i, r_i, s_{i+1}, a_{i+1} \\ s_k, a_k, r_k, s_{k+1}, a_{k+1} \\ s_m, a_m, r_m, s_{m+1}, a_{m+1} \\ s_n, a_n, r_n, s_{n+1}, a_{n+1} \\ \end{pmatrix} $$

We name it: SRASA algorithm

Exploration vs Exploitation¶

while True:
        action = env.action_space.sample()  # agent policy that uses the observation and info
        next_state, reward, terminated, truncated, info = env.step(action)

        trajectory.append( (state, action, reward) )

Exploitation¶

def policy(s): 
    return argmax(q_table[s]) 

while True:
        action = policy(state)  # agent policy that uses the observation and info
        next_state, reward, terminated, truncated, info = env.step(action)

        trajectory.append( (state, action, reward) )

Explore¶

def policy(s):
    epsilon = 0.05

    if random.randon() < epsilon: 
        return random.choice(range(len(q_table[s]))
    else:
        return argmax(q_table[s]) 

while True:
        action = policy(state)  # agent policy that uses the observation and info
        next_state, reward, terminated, truncated, info = env.step(action)

        trajectory.append( (state, action, reward) )

Q-Learning: because of the exploration, SARSA's next (state, action) may not hold the best Q-value.¶

$$ Q(s, a) = Q(s, a) + \alpha * [r + γ * max_{a' \in \mathcal{A}}(Q(s', a')) - Q(s, a)] $$

SARSA: $$ Q(s, a) = Q(s, a) + \alpha [r + \gamma Q(s', a_{next}) - Q(s, a)] $$ Q-LERNING: $$ Q(s, a) = Q(s, a) + \alpha * [r + γ * max_{a' \in \mathcal{A}}(Q(s', a')) - Q(s, a)] $$

off-policy vs. on-policy¶

on-policy: the policy learned depends on the current policy¶

off-policy: the policy leanred not depends on the current policy¶

Deep Reinforcement Learning¶

Deep-Q-Learning¶

image.png

image.png

Algorithm Formula¶

$$ y_j = r_j + \gamma \max_{a'} Q_{\text{target}}(s'_j, a'; \theta^-) $$$$ \quad \theta \leftarrow \theta - \frac{\alpha}{B} \sum_{j} (y_j - Q(s_j, a_j; \theta))^2 $$

Fixed Target and Memory Replay¶

  1. Fixed target $Q_{target}$
  2. Memory Replay: memory buffer

thiking: what benifits we can get by these two methods?¶

step1: Choose action $a$ using $\epsilon$-greedy policy based on Q-network

step2: Execute action $a$ and observe reward $r$ and next state $s'$

step3: Store transition $(s, a, r, s')$ in experience replay buffer

step4: Sample random minibatch of $B$ transitions from replay buffer

step5: FOR each transition $(s_j, a_j, r_j, s'_j)$ in minibatch

Compute target Q-value: $y_j = r_j + \gamma \max_{a'} Q_{\text{target}}(s'_j, a'; \theta^-)$

step6: Update Q-network parameters $\theta$ by minimizing the loss:

$ \quad \theta \leftarrow \theta - \frac{\alpha}{B} \sum_{j} (y_j - Q(s_j, a_j; \theta))^2$

step7: Update target network parameters $\theta^-$ periodically: $\theta^- \leftarrow \theta$

step8: Update exploration rate $\epsilon$ based on the exploration schedule

step9: Accumulate episode reward: $R \leftarrow R + r$

step10: Set current state as next state: $s \leftarrow s'$

Shortage of DeepQ-Learning¶

However, there are some shortages when we get $q(a, s)$ firstly, and use its value get best or $\epsilon$-best actions.

  1. WITHOUT REAL STOCHASTIC: There are so many occasions, such like an imperfectable information game, the best policy may be stochastic, we need to get the precisely probabiliy of each action, not just by greedy or $\epsilon$-greedy.

  2. BRITTLE: $\epsilon$-greedy selection may change dramatically for an arbitrarily small change in the estimated action values.

  3. HARD FOR CONTINUOUS: the $q(a, s)$ function is hard for continuous space and action problem.

  4. INFORMACTION IMPERFECT: because of our sensor limitation, there will be some situations that looks same, but actually are different. In such situations, we need different actions when face the same situations.

Therefore, we need a methods which could get the probability of given state-action pair. For continous space, we could get the value of each index or categorical of actions.

Policy Gradient: How can we get the action probability directly from a policy function with no Q-value¶

$$ a \leftarrow \pi(s) $$

Under the policy $\pi(\theta)$, for an episode, we want maxmize the value of initial state $s_0$, if we define the $J(\theta) = v_{\pi_{\theta}(s_0)} $, our target is: $$argmax_{\theta}{J(\theta)} $$

we use the tracjectories to be a finite episode or continuous episode with horizon H. If we collect so many tracjectories $(\tau^{0},\tau^{1}, \tau^{2},..., \tau^{N})$

$$ \begin{align} \nabla {J(\theta)} &= \nabla {\sum_{\tau} Pr(\tau; \theta) R(\tau)} \\ &= {\sum_{\tau} \nabla {Pr(\tau; \theta) R(\tau)}} \\ &= {\sum_{\tau} \frac{Pr(\tau; \theta)} {Pr(\tau; \theta)} \nabla {Pr(\tau; \theta) R(\tau)}} \\ &= \sum_{\tau} Pr(\tau; \theta) \frac {\nabla_{\theta} Pr(\tau; \theta)} {Pr(\tau; \theta)} R(\tau) \\ &= \frac{1}{m} \sum_{i = 0 }^{m}\nabla_{\theta} In(Pr(\tau; \theta))R(\tau^i) \text{ ; for sampled m tracjectories} \\ &= \frac{1}{m} \sum_{i = 0 }^{m} \nabla_{\theta} In(\prod_{t=0}^{H} Pr(s_{t+1}^{i}| s_{t}^i, a_t^i) \pi_{\theta}(a_t^i | s_{t}^i)) R(\tau^i)\\ &= \frac{1}{m} \sum_{i = 0 }^{m} \nabla_{\theta} \left[{\sum_{t = 0}^{H-1} In(Pr(s_{t+1}^i | s_t^i, a_t^i)) + \sum_{i = 0}^{H} In(\pi_{\theta}(a_t^i | s_t^i))}]R(\tau^i) \right]\\ &= \frac{1}{m} \sum_{i = 0 }^{m} \nabla_{\theta} \left[ \sum_{t = 0}^{H-1} In(Pr(a_t^i | s_t^i)_{\pi(\theta)})]R(\tau^i) \right] \\ &= \frac{1}{m} \sum_{i = 0 }^{m}\sum_{t = 0}^{H-1} \nabla_{\theta}In(Pr(a_t^i | s_t^i)_{\pi_{\theta}})R(\tau^i) \\ &\propto \frac{1}{m} \sum_{i = 0 }^{m}\sum_{t = 0}^{H-1} \nabla_{\theta}log(Pr(a_t^i | s_t^i)_{\pi_{\theta}})R(\tau^i) \end{align} $$

define the $\nabla{J(\theta)}$ as $\hat{g}$, and for each step $t$, we approximate the reward $R(\tau)$ as the future reward $R(\tau^t)$, for a given sample, we get:

$$ \hat{g} = \sum_{t = 0}^{H-1} \nabla_{\theta}log(\pi_{\theta}(a_t^i | s_t^i))\sum_{t=t}^{H}r(s_{i, t}, a_{i, t})$$

Therefore, we could get sample ${s_i, a_i}$ for $\pi(a | s)$:

  1. get the rewards $\sum_{t=t}^{H}r(s_{i, t}, a_{i, t})$
  2. evalutate $\nabla_{\theta}{J(\theta)} = \nabla_{\theta}log(\pi_{\theta}(a_t^i | s_t^i))\sum_{t=t}^{H}r(s_{i, t}, a_{i, t})$
  3. $\theta \leftarrow \theta + \alpha \nabla J_\theta (\theta)$

pytorch distribution¶

image.png

https://pytorch.org/docs/stable/distributions.html

Advantage Function & GAE¶

We call the previous algorithm REINFORCE. The algorithm could start an arbitary $\pi(\theta) $, and based on the update, get better $\theta$.

But, the REINFORCE methods have high variance.

why?

because when we get the $\theta$, we just could get limited actions from limited spaces. We could add some baseline to reduce the variance.

If we use the value function $v(s)$ to be the baseline, we could:

  1. get the action policy $\pi_\theta(\theta)$, we called actor
  2. get the policy evaluation method $v_{w}(w)$, we called critic

because of the $v(s)$ function, we could change our algorithm to an online algorithm, which doesn't need run an episode to get the rewards.

online actor-critic algorithm

  1. take action$ a \sim \pi_{\theta}(a | s) $, get $(s, a, s', r)$
  2. evaluate $\hat{A}^{\pi}(s, a) = r(s, a) + \gamma v_{w}^{\pi}(s') - v_{w}^{\pi}(s)$
  3. $ w \leftarrow \hat{A}^{\pi}(s, a) \nabla{V}(s, w) $
  4. $ \nabla J(\theta) = \nabla log \pi_{\theta}(a | s) \hat{A^{\pi}}(s, a) $
  5. $ \theta \leftarrow \theta + \alpha \nabla_{\theta}J(\theta)$

Importance Sampling¶

When we use the REINFORCE or Actor-Critic methods, we get a sample and use it and drop out it. What if we can use the tracjectories that we collected over and over again, and import our policy?

Remember when we do policy gradient, we want to maximize the $J(\theta) = \mathbf{E}_{(\tau; s_0, a_0, .. )} [\sum_{t = 0} ^ {\infty} \gamma^t r(s_t)]$

If there are a new policy $\pi(\theta')$ , we take the actions are sampled from $\pi(\theta')$ to get tracjectory $\tau $

$$ \begin{align} \because A_{\pi{\theta}} &= {E}_{s' \sim Pr(s' | s, a)}[r(s) + \gamma V_{\pi(\theta)}(s') - V_{\pi(\theta)}(s)] \\ \therefore E_{\tau | \pi(\theta')}[\sum_{t = 0} ^ {\infty} \gamma^{t} A_{\pi{\theta}}] &= E_{\tau | \pi(\theta')} [ \sum_{t = 0}^{\infty} \gamma^{t}(r(s) + \gamma V_{\pi(\theta)}(s') - V_{\pi(\theta)}(s))] \\ &= E_{\tau | \pi(\theta')} \left[ \sum_{t = 0}^{\infty} (\gamma^{t} r(s) + [\gamma^t * \gamma V_{\pi(\theta)}(s_{t+1)}) - \gamma^t V_{\pi}(s_t))]) \right] \\ &=E_{\tau | \pi(\theta')} \left[ \sum_{t = 0}^{\infty} (\gamma^{t} r(s)) - V_{\pi(\theta)}(s_0) \right] \\ &=J(\theta') - E_{\tau | \pi(\theta')}V_{\pi(\theta)}(s_0)\\ &=J(\theta') - E_{s_o}[V_{\pi(\theta)}(s_0)] \text{; because distribute $s_0$ is independent of $\theta$}\\ &=J(\theta') - J(\theta) \end{align} $$

Rewrite it as:

$$ J(\theta') - J(\theta) = E_{\tau | \pi(\theta')}[\sum_{t = 0} ^ {\infty} \gamma^{t} A_{\pi{\theta}}] $$

Because the $\tau$ is consist on (action, state), based on the margin distribution

$$ \begin{align} J(\theta') - J(\theta) &= E_{\tau | \pi(\theta')}[\sum_t \gamma^t A_{\pi{\theta}}] \\ &= \sum_{t} E_{s_t \sim p(\theta)} [E_{a_t \sim \pi(\theta'(a_t | s_t))}[\gamma^{t} A_{\pi{\theta}} (s_t, a_t)]] \end{align} $$

based on the sample importance:

$$\begin{align} E_{x \sim p(x) [f(x)]} &= \int p(x) f(x) dx \\ &= \int \frac{q(x)}{q(x)} p(x) f(x)dx \\ &= \int q(x)\frac{p(x)}{q(x)}f(x) dx \\ &= E_{x \sim q(x)} \left[\frac{p(x)}{q(x)}f(x) \right] \end{align}$$
$$\begin{align} J(\theta') - J(\theta) &= E_{\tau | \pi(\theta')}[\sum_t \gamma^t A_{\pi{\theta}}] \\ &= \sum_{t} E_{s_t \sim p(\theta')} [E_{a_t \sim \pi(\theta'(a_t | s_t))}[\gamma^{t} A_{\pi{\theta}} (s_t, a_t)]] \\ &= \sum_{t} E_{s_t \sim p(\theta')} \left[ E_{a_t \sim \pi_{\theta}}(a_t | s_t) \left[ \frac{\pi_{\theta'}(a_t, s_t)}{\pi_{\theta}(a_t, s_t)} \gamma^{t} A_{\pi{\theta}} (s_t, a_t) \right] \right] \end{align}$$

And, we know the $\pi(\theta)$, therefore, we could get the $E_{a_t \sim \pi_{ (\theta) } (a_t | s_t)}$ value

current now, we did not know the $E_{s_t \sim p(\theta')}$, because the $\theta'$ is the new policy paramters we want get.

However, if we could approaximate the $p(\theta')$ using $\theta$, we could update the $J(\theta)$ to $J(\theta')$ over and over again, just using the $\theta$, which we already know, and tracjectories we already have collected.

It is to say, if we could make fellow be true: $$\begin{align} J(\theta') - J(\theta) &= \sum_{t} E_{s_t \sim p(\theta')} \left[ E_{a_t \sim \pi_{\theta}}(a_t | s_t) \left[ \frac{\pi_{\theta'}(a_t, s_t)}{\pi_{\theta}(a_t, s_t)} \gamma^{t} A_{\pi{\theta}} (s_t, a_t) \right] \right] \\ &\approx \sum_{t} E_{s_t \sim p(\theta)} \left[ E_{a_t \sim \pi_{\theta}}(a_t | s_t) \left[ \frac{\pi_{\theta'}(a_t, s_t)}{\pi_{\theta}(a_t, s_t)} \gamma^{t} A_{\pi{\theta}} (s_t, a_t) \right] \right] \end{align}$$ if we define $\bar{A}(\theta') = \sum_{t} E_{s_t \sim p(\theta)} \left[ E_{a_t \sim \pi_{\theta}}(a_t | s_t) \left[ \frac{\pi_{\theta'}(a_t, s_t)}{\pi_{\theta}(a_t, s_t)} \gamma^{t} A_{\pi{\theta}} (s_t, a_t) \right] \right] $

We could use the:

$$\theta' \leftarrow argmax_{\theta'} \bar{A}(\theta')$$

to update $J(\theta)$ as quick as possible:

$$ J(\theta') - J(\theta) \approx \bar{A}(\theta')$$

Current now, we meet a new question, when $E_{s_t \sim p(\theta')} \approx E_{s_t \sim p(\theta')}$

By mathematical deduction, RL: lecture-14¶

We can get $J(\theta')$ with sampled importance:

$$\begin{align} &\sum_{t} E_{s_t \sim p(\theta')} \left[ E_{a_t \sim \pi_{\theta}}(a_t | s_t) \left[ \frac{\pi_{\theta'}(a_t, s_t)}{\pi_{\theta}(a_t, s_t)} \gamma^{t} A_{\pi{\theta}} (s_t, a_t) \right] \right] \geq \\ &\sum_{t} E_{s_t \sim p(\theta)} \left[ E_{a_t \sim \pi_{\theta}}(a_t | s_t) \left[ \frac{\pi_{\theta'}(a_t, s_t)}{\pi_{\theta}(a_t, s_t)} \gamma^{t} A_{\pi{\theta}} (s_t, a_t) \right] \right] - \sum_{t} 2 \epsilon t C \end{align}$$

PPO: Proximal Policy Optimization¶

When using $\mathcal{L}(\theta') = E \left[ \frac{\pi_{\theta'}(a_t, s_t)} {\pi_{\theta}(a_t, s_t)} A_{\pi{\theta}} (s_t, a_t) \right]$ for one step $t$,

the $\frac{\pi_{\theta'}(a_t, s_t)} {\pi_{\theta}(a_t, s_t)} $ may to large and get the update too dramatically. In order to solve this problem, the paper proposed two methods to solve this problem:

PPO-Clipped Surrogate Objective (usually use it!)¶

This methods change the $\mathcal{L}$ to a clipped loss function. let $r_{t}(\theta) = \frac{\pi_{\theta'}(a_t, s_t)} {\pi_{\theta}(a_t, s_t)}$:

$$\mathcal{L}^{CLIP}(\theta) = \mathbf{E}_t\left[ min(r_{t}(\theta) \hat{A_t}, clip(r_t{\theta}, 1 - \epsilon, 1 + \epsilon)\hat{A_t})\right]$$

This method could let the updating not to small or to large. Could make the converagence more stable.

Adaptive KL Penalty Coefficient¶

$$ \mathcal{L}^{KLPEN}(\theta) = \mathbf{E}_t \left[ \frac{\pi_{\theta'}(a_t, s_t)} {\pi_{\theta}(a_t, s_t)} A_{t, \pi{\theta}}\right] - \beta D_{KL} \left[ \pi_{\theta}{(\cdot | s_t)}, \pi_{\theta'}{(\cdot | s_t)}\right]$$

with $$D_{KL}(p | q) = \sum_{i \in X} \left( log \frac{p(i)} {q(i)} p(i)\right)$$

Compute $d = \mathbf{E}_t D_{KL} \left[ \pi_{\theta}{(\cdot | s_t)}, \pi_{\theta'}{(\cdot | s_t)} \right]$

  • if $ d < d_{targ} / 1.5, \beta \leftarrow \beta / 2$
  • if $ d > d_{targ} * 1.5, \beta \leftarrow \beta * 2$

Talking: What's relationship between Policy Gradient and PPO algorithm?¶

Source code analysis: https://github.com/nikhilbarhate99/PPO-PyTorch/blob/master/PPO.py¶

more detailed implementation: https://github.com/openai/baselines/tree/master/baselines/ppo2¶

Homework:¶

welcome to share your answer in our Learning GROUP; or you can send email to minchiuan.gao@gmail.com, I will reply you personally.

Please Answer these questions:¶

  1. If we use supervised learning methods to optimize the long-term sequential problem, what will happen?

ans:

  1. What's the advantage and disadvantage in Monte Carlo RL methods?

ans:

  1. Can you propose more situations that are not satisfied with Markov Property*?

ans:

  1. Monte Carlo, SARSA, Q-Learning, DeepQ-Learning, Policy Gradient, PPO, can you write a summary to explain why we are using PPO algorithms now? In different periods, what we mainly want to sovle the previous algorithm?

ans:

  1. What if we don't know the reward in a RL task? How can we still learn something?

ans:

  1. Current now, we use deep learning method to solve RL problems. What shortages exist in this paradigm do you think? (long-term learning, explaination ability, stability... )

ans:

2. (optional) coding exerise, you another jupyter notebook, we supply a policy gradient (REINFORCE) training algorithm. There are two important TODO you need to write by yourself.¶